Dec 03 17:00:04 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 17:00:04 crc restorecon[4700]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 17:00:04 crc restorecon[4700]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc 
restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc 
restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 
crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:04 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc 
restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc 
restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 17:00:05 crc restorecon[4700]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 
crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc 
restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc 
restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc 
restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc 
restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 17:00:05 crc restorecon[4700]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 17:00:06 crc kubenswrapper[4841]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 17:00:06 crc kubenswrapper[4841]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 17:00:06 crc kubenswrapper[4841]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 17:00:06 crc kubenswrapper[4841]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 17:00:06 crc kubenswrapper[4841]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 17:00:06 crc kubenswrapper[4841]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.041071 4841 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044811 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044844 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044849 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044854 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044858 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044863 4841 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044867 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044871 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044877 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 17:00:06 crc 
kubenswrapper[4841]: W1203 17:00:06.044883 4841 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044889 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044894 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044899 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044929 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044939 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044947 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044954 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044958 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044962 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044966 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044971 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044975 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044979 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044982 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044986 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044990 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044994 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.044998 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045003 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045006 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045010 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045014 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045018 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045021 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045025 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045029 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045033 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045036 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045040 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045045 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045050 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045054 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045058 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045063 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045067 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045072 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045076 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045082 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045086 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045090 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045094 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045097 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045100 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045104 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045107 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045112 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045115 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045119 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045122 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045125 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045129 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045132 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045136 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045140 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045144 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045147 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045152 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045157 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045161 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045164 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.045168 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045272 4841 flags.go:64] FLAG: --address="0.0.0.0"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045283 4841 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045292 4841 flags.go:64] FLAG: --anonymous-auth="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045298 4841 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045304 4841 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045308 4841 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045315 4841 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045322 4841 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045327 4841 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045331 4841 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045337 4841 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045341 4841 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045345 4841 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045350 4841 flags.go:64] FLAG: --cgroup-root=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045353 4841 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045358 4841 flags.go:64] FLAG: --client-ca-file=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045362 4841 flags.go:64] FLAG: --cloud-config=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045366 4841 flags.go:64] FLAG: --cloud-provider=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045370 4841 flags.go:64] FLAG: --cluster-dns="[]"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045375 4841 flags.go:64] FLAG: --cluster-domain=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045379 4841 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045384 4841 flags.go:64] FLAG: --config-dir=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045389 4841 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045394 4841 flags.go:64] FLAG: --container-log-max-files="5"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045400 4841 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045405 4841 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045411 4841 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045416 4841 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045421 4841 flags.go:64] FLAG: --contention-profiling="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045426 4841 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045430 4841 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045435 4841 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045439 4841 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045445 4841 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045449 4841 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045454 4841 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045458 4841 flags.go:64] FLAG: --enable-load-reader="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045462 4841 flags.go:64] FLAG: --enable-server="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045466 4841 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045983 4841 flags.go:64] FLAG: --event-burst="100"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045992 4841 flags.go:64] FLAG: --event-qps="50"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.045998 4841 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046043 4841 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046051 4841 flags.go:64] FLAG: --eviction-hard=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046062 4841 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046069 4841 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046076 4841 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046094 4841 flags.go:64] FLAG: --eviction-soft=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046100 4841 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046105 4841 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046111 4841 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046116 4841 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046122 4841 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046127 4841 flags.go:64] FLAG: --fail-swap-on="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046133 4841 flags.go:64] FLAG: --feature-gates=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046148 4841 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046154 4841 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046162 4841 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046171 4841 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046177 4841 flags.go:64] FLAG: --healthz-port="10248"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046183 4841 flags.go:64] FLAG: --help="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046190 4841 flags.go:64] FLAG: --hostname-override=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046196 4841 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046202 4841 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046258 4841 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046266 4841 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046271 4841 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046278 4841 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046284 4841 flags.go:64] FLAG: --image-service-endpoint=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046289 4841 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046294 4841 flags.go:64] FLAG: --kube-api-burst="100"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046299 4841 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046307 4841 flags.go:64] FLAG: --kube-api-qps="50"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046315 4841 flags.go:64] FLAG: --kube-reserved=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046320 4841 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046325 4841 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046329 4841 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046335 4841 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046340 4841 flags.go:64] FLAG: --lock-file=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046344 4841 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046350 4841 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046361 4841 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046370 4841 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046375 4841 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046379 4841 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046383 4841 flags.go:64] FLAG: --logging-format="text"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046388 4841 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046394 4841 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046400 4841 flags.go:64] FLAG: --manifest-url=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046406 4841 flags.go:64] FLAG: --manifest-url-header=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046420 4841 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046425 4841 flags.go:64] FLAG: --max-open-files="1000000"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046431 4841 flags.go:64] FLAG: --max-pods="110"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046437 4841 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046443 4841 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046451 4841 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046457 4841 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046462 4841 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046473 4841 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046480 4841 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046504 4841 flags.go:64] FLAG: --node-status-max-images="50"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046508 4841 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046513 4841 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046518 4841 flags.go:64] FLAG: --pod-cidr=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046529 4841 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046541 4841 flags.go:64] FLAG: --pod-manifest-path=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046546 4841 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046552 4841 flags.go:64] FLAG: --pods-per-core="0"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046556 4841 flags.go:64] FLAG: --port="10250"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046562 4841 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046569 4841 flags.go:64] FLAG: --provider-id=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046575 4841 flags.go:64] FLAG: --qos-reserved=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046581 4841 flags.go:64] FLAG: --read-only-port="10255"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046591 4841 flags.go:64] FLAG: --register-node="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046597 4841 flags.go:64] FLAG: --register-schedulable="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046602 4841 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046613 4841 flags.go:64] FLAG: --registry-burst="10"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046617 4841 flags.go:64] FLAG: --registry-qps="5"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046623 4841 flags.go:64] FLAG: --reserved-cpus=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046627 4841 flags.go:64] FLAG: --reserved-memory=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046643 4841 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046648 4841 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046653 4841 flags.go:64] FLAG: --rotate-certificates="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046658 4841 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046662 4841 flags.go:64] FLAG: --runonce="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046666 4841 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046671 4841 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046676 4841 flags.go:64] FLAG: --seccomp-default="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046680 4841 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046688 4841 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046693 4841 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046698 4841 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046703 4841 flags.go:64] FLAG: --storage-driver-password="root"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046707 4841 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046711 4841 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046716 4841 flags.go:64] FLAG: --storage-driver-user="root"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046722 4841 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046728 4841 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046738 4841 flags.go:64] FLAG: --system-cgroups=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046742 4841 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046753 4841 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046757 4841 flags.go:64] FLAG: --tls-cert-file=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046762 4841 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046770 4841 flags.go:64] FLAG: --tls-min-version=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046781 4841 flags.go:64] FLAG: --tls-private-key-file=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046786 4841 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046790 4841 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046794 4841 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046800 4841 flags.go:64] FLAG: --v="2"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046807 4841 flags.go:64] FLAG: --version="false"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046816 4841 flags.go:64] FLAG: --vmodule=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046823 4841 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.046835 4841 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047070 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047078 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047083 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047087 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047093 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047097 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047105 4841 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047111 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047117 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047122 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047126 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047131 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047135 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047140 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047144 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047149 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047153 4841 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047157 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047164 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047170 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047174 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047179 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047267 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047539 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047556 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047561 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047570 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047575 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047580 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047588 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047596 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047601 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047606 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047612 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047616 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047620 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047625 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047630 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047635 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047640 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047645 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047649 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047654 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047657 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047662 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047666 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047669 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047674 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047678 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047682 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047685 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047690 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047694 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047700 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047704 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047708 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047714 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047718 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047723 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047726 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047730 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047735 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047738 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047743 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047746 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047750 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047753 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047757 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047760 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047764 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.047767 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.047991 4841 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.057933 4841 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.057978 4841 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058101 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058111 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058117 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058122 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058128 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058133 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058138 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058142 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058147 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058152 4841 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058160 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058165 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058169 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058174 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058179 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058184 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058190 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058194 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058199 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058204 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058208 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058213 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058218 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058223 4841 feature_gate.go:330] unrecognized feature gate:
MinimumKubeletVersion Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058228 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058232 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058238 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058242 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058247 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058252 4841 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058257 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058263 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058268 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058273 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058278 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058282 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058287 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058292 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 17:00:06 crc 
kubenswrapper[4841]: W1203 17:00:06.058297 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058302 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058306 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058311 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058319 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058326 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058333 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058338 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058342 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058347 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058353 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058357 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058362 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058367 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 
17:00:06.058373 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058377 4841 feature_gate.go:330] unrecognized feature gate: Example Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058382 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058387 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058392 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058399 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058405 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058411 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058417 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058422 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058427 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058433 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058438 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058442 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058448 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058453 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058460 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058467 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058472 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.058481 4841 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058630 4841 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058639 4841 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058647 4841 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058653 4841 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058660 4841 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058667 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058674 4841 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058680 4841 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058687 4841 feature_gate.go:330] 
unrecognized feature gate: Example Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058693 4841 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058699 4841 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058706 4841 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058712 4841 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058719 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058726 4841 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058732 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058740 4841 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058745 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058752 4841 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058758 4841 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058763 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058771 4841 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058778 4841 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 17:00:06 crc kubenswrapper[4841]: 
W1203 17:00:06.058784 4841 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058792 4841 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058798 4841 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058805 4841 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058812 4841 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058818 4841 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058824 4841 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058831 4841 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058837 4841 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058843 4841 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058850 4841 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058859 4841 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058865 4841 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058874 4841 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058882 4841 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058889 4841 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058896 4841 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058946 4841 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058955 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058962 4841 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058968 4841 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058974 4841 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058981 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058988 4841 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.058995 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059002 4841 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059008 4841 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059014 4841 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059020 4841 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059025 4841 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059030 4841 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059036 4841 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059043 4841 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059052 4841 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059059 4841 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059066 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059072 4841 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059080 4841 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059088 4841 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059096 4841 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059102 4841 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059109 4841 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059115 4841 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059121 4841 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059128 4841 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059134 4841 feature_gate.go:330] unrecognized feature gate: 
VSphereControlPlaneMachineSet Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059140 4841 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.059147 4841 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.059156 4841 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.059691 4841 server.go:940] "Client rotation is on, will bootstrap in background" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.063999 4841 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.064130 4841 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.064865 4841 server.go:997] "Starting client certificate rotation" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.064898 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.065447 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-12 01:50:08.192879532 +0000 UTC Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.065567 4841 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 200h50m2.127319456s for next certificate rotation Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.076526 4841 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.079292 4841 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.096811 4841 log.go:25] "Validated CRI v1 runtime API" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.124886 4841 log.go:25] "Validated CRI v1 image API" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.126781 4841 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.128961 4841 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-16-55-17-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.129010 4841 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:41 fsType:tmpfs blockSize:0}] Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.153890 4841 manager.go:217] Machine: {Timestamp:2025-12-03 17:00:06.151722587 +0000 UTC m=+0.539243384 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:07a56beb-ca95-4540-9e44-8534d93c2a77 BootID:d8830a74-3409-4e59-a7ee-2c2a0b4959ce Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:41 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:d5:b0:90 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:d5:b0:90 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b5:13:36 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:3a:e5:b2 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:48:50:45 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ea:69:3c Speed:-1 Mtu:1496} {Name:eth10 MacAddress:ee:1d:39:8b:12:99 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:56:ca:8f:7a:40:cb Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 
Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.154358 4841 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.154660 4841 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.155972 4841 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.156351 4841 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.156437 4841 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.156812 4841 topology_manager.go:138] "Creating topology manager with none policy"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.156835 4841 container_manager_linux.go:303] "Creating device plugin manager"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.157056 4841 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.157101 4841 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.157610 4841 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.157814 4841 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.158946 4841 kubelet.go:418] "Attempting to sync node with API server"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.158993 4841 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.159150 4841 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.159185 4841 kubelet.go:324] "Adding apiserver pod source"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.159235 4841 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.161529 4841 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.162128 4841 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.163439 4841 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164192 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164238 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164255 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164269 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164292 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164305 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164318 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164340 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164363 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164378 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164432 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164445 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.164626 4841 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.165362 4841 server.go:1280] "Started kubelet"
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.165689 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.165782 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.165846 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.165952 4841 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.165845 4841 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.166834 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.166868 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.167495 4841 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Dec 03 17:00:06 crc systemd[1]: Started Kubernetes Kubelet.
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.169120 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.169169 4841 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.169304 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 18:36:53.877748342 +0000 UTC
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.169686 4841 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 1009h36m47.708066407s for next certificate rotation
Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.170085 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.176711 4841 volume_manager.go:287] "The desired_state_of_world populator starts"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.176775 4841 volume_manager.go:289] "Starting Kubelet Volume Manager"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.177047 4841 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.170533 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187dc32823c4a8ee default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 17:00:06.165301486 +0000 UTC m=+0.552822254,LastTimestamp:2025-12-03 17:00:06.165301486 +0000 UTC m=+0.552822254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.179306 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms"
Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.179519 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused
Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.179987 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.180052 4841 factory.go:55] Registering systemd factory
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.180346 4841 factory.go:221] Registration of the systemd container factory successfully
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.179741 4841 server.go:460] "Adding debug handlers to kubelet server"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.181121 4841 factory.go:153] Registering CRI-O factory
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.181230 4841 factory.go:221] Registration of the crio container factory successfully
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.181356 4841 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.181436 4841 factory.go:103] Registering Raw factory
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.181481 4841 manager.go:1196] Started watching for new ooms in manager
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.184295 4841 manager.go:319] Starting recovery of all containers
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192545 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192615 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192638 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192659 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192678 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192697 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192716 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192735 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192761 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192804 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192823 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192843 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192863 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192886 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192929 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192950 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192970 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.192987 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193007 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193025 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193044 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193061 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193080 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193100 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193120 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193164 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193188 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193211 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193234 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193274 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193310 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193467 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193488 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193507 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193525 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193545 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193565 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193583 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193601 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193621 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193639 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193660 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193679 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193699 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193718 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193739 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193757 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193775 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.193793 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.194767 4841 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.194809 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.194832 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.194857 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.194887 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.194937 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.194961 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.194982 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195002 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195023 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195048 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195074 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195098 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195121 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195146 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195175 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195201 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195226 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195248 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195272 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195291 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195310 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195482 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195531 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195561 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195586 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195609 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195633 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195660 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195686 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195713 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195741 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195766 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195794 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195823 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195853 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195880 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195942 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195973 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config"
seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.195998 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196022 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196048 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196071 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196094 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196116 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 
17:00:06.196140 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196162 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196187 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196211 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196237 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196263 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196290 4841 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196315 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196335 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196353 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196372 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196403 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196423 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196454 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196478 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196498 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196520 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196543 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196563 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196582 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196603 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196622 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196641 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196660 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196678 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196696 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196713 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196767 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196787 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196805 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196823 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196840 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196858 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196877 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196894 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196943 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196962 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196980 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.196998 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197018 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197036 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197058 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197089 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197124 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197159 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197189 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197210 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197228 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197249 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" 
volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197270 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197290 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197308 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197327 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197346 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197367 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197386 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197403 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197422 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197440 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197458 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197478 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197495 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197515 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197532 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197550 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197567 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197585 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" 
seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197602 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197622 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197640 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197660 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197678 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197696 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197715 4841 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197733 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197751 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197770 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197791 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197810 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197827 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197846 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197863 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197884 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197902 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197951 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197969 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.197988 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198008 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198066 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198086 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198112 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198132 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198150 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198169 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198186 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198205 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198223 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198240 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198259 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198277 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198296 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198315 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198334 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198359 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198377 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198508 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198528 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198545 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198563 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198584 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" 
seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198601 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198626 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198653 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198681 4841 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198704 4841 reconstruct.go:97] "Volume reconstruction finished" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.198721 4841 reconciler.go:26] "Reconciler: start to sync state" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.213410 4841 manager.go:324] Recovery completed Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.231826 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.235130 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc 
kubenswrapper[4841]: I1203 17:00:06.235318 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.235384 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.235512 4841 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.236743 4841 cpu_manager.go:225] "Starting CPU manager" policy="none" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.236781 4841 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.236823 4841 state_mem.go:36] "Initialized new in-memory state store" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.237424 4841 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.237495 4841 status_manager.go:217] "Starting to sync pod status with apiserver" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.237529 4841 kubelet.go:2335] "Starting kubelet main sync loop" Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.237603 4841 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.238801 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.238883 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list 
*v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.246756 4841 policy_none.go:49] "None policy: Start" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.247812 4841 memory_manager.go:170] "Starting memorymanager" policy="None" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.247856 4841 state_mem.go:35] "Initializing new in-memory state store" Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.271090 4841 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.311332 4841 manager.go:334] "Starting Device Plugin manager" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.311470 4841 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.311497 4841 server.go:79] "Starting device plugin registration server" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.312379 4841 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.312423 4841 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.312740 4841 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.312857 4841 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.312876 4841 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.324993 4841 
eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.338189 4841 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc"] Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.338409 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.340543 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.340625 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.340853 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.341178 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.341630 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.341742 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.342780 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.342846 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.342865 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.343136 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.343335 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.343465 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.343487 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.343554 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.343585 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.344729 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.344765 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.344788 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.344860 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.344937 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.344960 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.344970 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc 
kubenswrapper[4841]: I1203 17:00:06.345253 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.345337 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.346095 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.346144 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.346165 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.346348 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.346599 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.346677 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347156 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347196 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347214 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347445 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347498 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347742 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347788 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347806 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347860 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.347892 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc 
kubenswrapper[4841]: I1203 17:00:06.347950 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.348383 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.348418 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.348435 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.380328 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401209 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401239 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401269 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401283 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401300 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401315 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401329 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401345 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401359 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401399 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401433 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401460 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401600 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.401728 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.412988 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.414742 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.414778 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.414790 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.414870 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.415375 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 
17:00:06.504111 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504206 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504256 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504298 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504352 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504391 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504418 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504429 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504485 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504431 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504560 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504567 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" 
(UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504563 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504612 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504624 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504665 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504701 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504715 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504724 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504779 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504761 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504818 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504846 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504874 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.504974 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.505025 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.505035 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.505100 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.505248 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.616442 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.619842 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.619887 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.619900 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.619945 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.620291 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 03 
17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.671019 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.679362 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.686604 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.714502 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: I1203 17:00:06.719427 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.731952 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-31282e87b13e7b3ebdbb3fd142a3c9412663a22886dea1b2eaffcd70c9e37da1 WatchSource:0}: Error finding container 31282e87b13e7b3ebdbb3fd142a3c9412663a22886dea1b2eaffcd70c9e37da1: Status 404 returned error can't find the container with id 31282e87b13e7b3ebdbb3fd142a3c9412663a22886dea1b2eaffcd70c9e37da1 Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.732473 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-d68d77fe1f3d385b2d1e144cd3fc3bf9ff490449ebddd0d010abdc03c2572206 WatchSource:0}: Error finding container d68d77fe1f3d385b2d1e144cd3fc3bf9ff490449ebddd0d010abdc03c2572206: Status 404 returned error can't find the container with id 
d68d77fe1f3d385b2d1e144cd3fc3bf9ff490449ebddd0d010abdc03c2572206 Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.740260 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-62069f92c4801dac8e141556117d7e0dd2a26efac8b4e6913f06dd2f04b44d58 WatchSource:0}: Error finding container 62069f92c4801dac8e141556117d7e0dd2a26efac8b4e6913f06dd2f04b44d58: Status 404 returned error can't find the container with id 62069f92c4801dac8e141556117d7e0dd2a26efac8b4e6913f06dd2f04b44d58 Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.754340 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-aa177b1175ed8b9b3720fa37452b4f856df25636cc4256fbcb11cafc0bef8d44 WatchSource:0}: Error finding container aa177b1175ed8b9b3720fa37452b4f856df25636cc4256fbcb11cafc0bef8d44: Status 404 returned error can't find the container with id aa177b1175ed8b9b3720fa37452b4f856df25636cc4256fbcb11cafc0bef8d44 Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.781842 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Dec 03 17:00:06 crc kubenswrapper[4841]: W1203 17:00:06.994390 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 03 17:00:06 crc kubenswrapper[4841]: E1203 17:00:06.994487 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.021015 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.022508 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.022548 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.022563 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.022589 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 17:00:07 crc kubenswrapper[4841]: E1203 17:00:07.022865 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.167580 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 03 17:00:07 crc kubenswrapper[4841]: W1203 17:00:07.206154 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 03 17:00:07 crc kubenswrapper[4841]: E1203 17:00:07.206242 4841 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.243054 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2" exitCode=0 Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.243140 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2"} Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.243265 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d68d77fe1f3d385b2d1e144cd3fc3bf9ff490449ebddd0d010abdc03c2572206"} Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.243383 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.244545 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.244577 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.244587 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.245203 4841 generic.go:334] 
"Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b3186e3a027dfa49893f1449aae38cd97f397cb6321edfa7d84b47773a05a0eb" exitCode=0 Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.245275 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b3186e3a027dfa49893f1449aae38cd97f397cb6321edfa7d84b47773a05a0eb"} Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.245299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6c9016ff346f5b471b55e60d6f7a0c1484f4ac047f3513d699572c26e30133a4"} Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.245377 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.246245 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.246274 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.246287 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.247191 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.247683 4841 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="5a8db49894f499fcec6b1155ad90c6bea7f4be0796772119fd3a546029b26f75" exitCode=0 Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.247706 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"5a8db49894f499fcec6b1155ad90c6bea7f4be0796772119fd3a546029b26f75"} Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.247749 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"aa177b1175ed8b9b3720fa37452b4f856df25636cc4256fbcb11cafc0bef8d44"} Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.247819 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.248731 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.248782 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.248803 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.249041 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.249122 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.249141 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.255219 4841 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36" exitCode=0 Dec 03 
17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.255316 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36"} Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.255423 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"62069f92c4801dac8e141556117d7e0dd2a26efac8b4e6913f06dd2f04b44d58"} Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.255612 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.256974 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.257020 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.257042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.258299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3"} Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.258365 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"31282e87b13e7b3ebdbb3fd142a3c9412663a22886dea1b2eaffcd70c9e37da1"} Dec 03 
17:00:07 crc kubenswrapper[4841]: W1203 17:00:07.277063 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 03 17:00:07 crc kubenswrapper[4841]: E1203 17:00:07.277150 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 03 17:00:07 crc kubenswrapper[4841]: E1203 17:00:07.583857 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Dec 03 17:00:07 crc kubenswrapper[4841]: W1203 17:00:07.623676 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 03 17:00:07 crc kubenswrapper[4841]: E1203 17:00:07.623966 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.151:6443: connect: connection refused" logger="UnhandledError" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.823040 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.824892 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.824980 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.825018 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:07 crc kubenswrapper[4841]: I1203 17:00:07.825089 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 17:00:07 crc kubenswrapper[4841]: E1203 17:00:07.825836 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.151:6443: connect: connection refused" node="crc" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.168106 4841 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.151:6443: connect: connection refused Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.263513 4841 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="e471bcba1e8ef9be69fd0490cb31e19be4ffb571dafd9a336e568a0710e136f8" exitCode=0 Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.263627 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"e471bcba1e8ef9be69fd0490cb31e19be4ffb571dafd9a336e568a0710e136f8"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.263801 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.265387 4841 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.265440 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.265452 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.267983 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"db590e388e260f41f701ef0195db012a92624ed4b81facaffb990edcbd011dcc"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.268203 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.269460 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.269500 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.269536 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.271392 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e2279832fcaf60304abf81ce1e519adee74bde6605594e7dfc927f3d0ccc3774"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.271439 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"61aece78525cb0dbaf703af487d9ef3d7259e7e3684d5386427f25cd50662f3d"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.271459 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"96df3f26a8680d6b5e0018c87bdd1677e6efc076b1c1759134b635d48575ef3f"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.271567 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.272583 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.272635 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.272652 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.275421 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.275467 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.275489 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.275590 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.276571 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.276597 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.276609 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.282142 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.282188 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.282208 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.282221 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783"} Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.710810 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:08 crc kubenswrapper[4841]: I1203 17:00:08.719391 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.287400 4841 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="7663232f63adf8c741f35740996dae5dfd1e1e22f6003856622469a57b790b62" exitCode=0 Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.287500 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"7663232f63adf8c741f35740996dae5dfd1e1e22f6003856622469a57b790b62"} Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.287782 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.290650 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.290746 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.290770 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.292988 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76"} Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.293089 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.293138 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.294437 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.294483 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.294504 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.294444 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.294601 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.294629 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.426510 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.427800 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.427853 4841 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.427871 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.427931 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 17:00:09 crc kubenswrapper[4841]: I1203 17:00:09.855043 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.300871 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"45f4ff573ee4d1a429cbe8b08331c2fb3be072670ddfdf79600a731a98a21e0c"} Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.301350 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.301376 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"477772853dfbd38f2a8e28b7df4ee1c25e53e67fd6208dc64fe2442d276ea7c4"} Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.301395 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"642c075b014c2e8527882be99d5ac5c3c94c6af09c6df5b4cc753ae2fe218c95"} Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.301042 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.301042 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:10 
crc kubenswrapper[4841]: I1203 17:00:10.302997 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.303065 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.303081 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.303716 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.303782 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.303801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.369613 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.369889 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.371053 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.371092 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:10 crc kubenswrapper[4841]: I1203 17:00:10.371103 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.307390 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"da48e441f0d68ef47c15ccdaefc3e391c2f5bc1d9f75c1e7f478cca976e58995"} Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.307450 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.307459 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"33a99f0e03ce3172dd4c73b0ddd697666e725c8e7a5c27d2c0bd7ee2066ce12a"} Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.307476 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.307539 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.308391 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.308411 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.308418 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.308968 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.309002 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.309016 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.309022 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.309049 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:11 crc kubenswrapper[4841]: I1203 17:00:11.309027 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.126970 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.310008 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.311006 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.311142 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.311278 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.311295 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.312809 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.312851 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.312868 
4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.710168 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.710370 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.711670 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.711844 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:12 crc kubenswrapper[4841]: I1203 17:00:12.712013 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.614146 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.614449 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.616356 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.616439 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.616462 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.794319 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.794510 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.795639 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.795669 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:13 crc kubenswrapper[4841]: I1203 17:00:13.795677 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:15 crc kubenswrapper[4841]: I1203 17:00:15.485720 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Dec 03 17:00:15 crc kubenswrapper[4841]: I1203 17:00:15.486003 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:15 crc kubenswrapper[4841]: I1203 17:00:15.487848 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:15 crc kubenswrapper[4841]: I1203 17:00:15.487954 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:15 crc kubenswrapper[4841]: I1203 17:00:15.487984 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:16 crc kubenswrapper[4841]: E1203 17:00:16.325118 4841 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Dec 03 17:00:16 crc kubenswrapper[4841]: I1203 17:00:16.614556 4841 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe 
status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 17:00:16 crc kubenswrapper[4841]: I1203 17:00:16.614654 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 17:00:18 crc kubenswrapper[4841]: W1203 17:00:18.928630 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 17:00:18 crc kubenswrapper[4841]: I1203 17:00:18.928748 4841 trace.go:236] Trace[1384861394]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 17:00:08.927) (total time: 10001ms): Dec 03 17:00:18 crc kubenswrapper[4841]: Trace[1384861394]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:00:18.928) Dec 03 17:00:18 crc kubenswrapper[4841]: Trace[1384861394]: [10.001272348s] [10.001272348s] END Dec 03 17:00:18 crc kubenswrapper[4841]: E1203 17:00:18.928785 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.168744 4841 csi_plugin.go:884] Failed to contact API server when waiting 
for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Dec 03 17:00:19 crc kubenswrapper[4841]: E1203 17:00:19.185379 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Dec 03 17:00:19 crc kubenswrapper[4841]: W1203 17:00:19.283027 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.283130 4841 trace.go:236] Trace[590294098]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 17:00:09.280) (total time: 10002ms): Dec 03 17:00:19 crc kubenswrapper[4841]: Trace[590294098]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10002ms (17:00:19.283) Dec 03 17:00:19 crc kubenswrapper[4841]: Trace[590294098]: [10.002129547s] [10.002129547s] END Dec 03 17:00:19 crc kubenswrapper[4841]: E1203 17:00:19.283153 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 17:00:19 crc kubenswrapper[4841]: E1203 17:00:19.429499 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc" Dec 
03 17:00:19 crc kubenswrapper[4841]: W1203 17:00:19.477709 4841 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.477810 4841 trace.go:236] Trace[686851812]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 17:00:09.475) (total time: 10002ms): Dec 03 17:00:19 crc kubenswrapper[4841]: Trace[686851812]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (17:00:19.477) Dec 03 17:00:19 crc kubenswrapper[4841]: Trace[686851812]: [10.002026624s] [10.002026624s] END Dec 03 17:00:19 crc kubenswrapper[4841]: E1203 17:00:19.477835 4841 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.627043 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.627135 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed 
with statuscode: 403" Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.638497 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.638571 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.862129 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.862262 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.863835 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.863871 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:19 crc kubenswrapper[4841]: I1203 17:00:19.863882 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.266427 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.266700 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.267895 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.268182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.268210 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.290172 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.334048 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.334855 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.334896 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.334940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:21 crc kubenswrapper[4841]: I1203 17:00:21.351082 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.337047 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.338383 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.338438 4841 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.338455 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.629981 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.631507 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.631569 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.631596 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.631633 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 17:00:22 crc kubenswrapper[4841]: E1203 17:00:22.636898 4841 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.715656 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.715833 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.716930 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.716996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.717011 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:22 crc kubenswrapper[4841]: I1203 17:00:22.719894 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:23 crc kubenswrapper[4841]: I1203 17:00:23.010255 4841 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 17:00:23 crc kubenswrapper[4841]: I1203 17:00:23.339063 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:23 crc kubenswrapper[4841]: I1203 17:00:23.340320 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:23 crc kubenswrapper[4841]: I1203 17:00:23.340410 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:23 crc kubenswrapper[4841]: I1203 17:00:23.340438 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:23 crc kubenswrapper[4841]: I1203 17:00:23.813340 4841 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.638238 4841 trace.go:236] Trace[2087074542]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 17:00:10.811) (total time: 13827ms): Dec 03 17:00:24 crc kubenswrapper[4841]: Trace[2087074542]: ---"Objects listed" error: 13827ms (17:00:24.638) Dec 03 17:00:24 crc kubenswrapper[4841]: Trace[2087074542]: [13.827116405s] [13.827116405s] END Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.638300 4841 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.640079 4841 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.679241 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.679410 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.680405 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.680450 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.680463 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.682202 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.682224 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": EOF" start-of-body= Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.682294 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get 
\"https://192.168.126.11:17697/healthz\": EOF" Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.682238 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": EOF" Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.682817 4841 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.682866 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Dec 03 17:00:24 crc kubenswrapper[4841]: I1203 17:00:24.684414 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:00:25 crc kubenswrapper[4841]: I1203 17:00:25.186630 4841 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 17:00:25 crc kubenswrapper[4841]: I1203 17:00:25.345611 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 17:00:25 crc kubenswrapper[4841]: I1203 17:00:25.347712 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76" 
exitCode=255 Dec 03 17:00:25 crc kubenswrapper[4841]: I1203 17:00:25.347787 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76"} Dec 03 17:00:25 crc kubenswrapper[4841]: I1203 17:00:25.360845 4841 scope.go:117] "RemoveContainer" containerID="51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.175855 4841 apiserver.go:52] "Watching apiserver" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.179922 4841 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.180222 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.180572 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.180672 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.180732 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.180813 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.181374 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.181305 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.183726 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.183811 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.184235 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.184249 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.184382 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.184503 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.184607 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.184761 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.184874 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.185080 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.185477 4841 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.188811 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.216759 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.235316 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for 
client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting 
DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8
f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.248369 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\
"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.263706 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.277933 4841 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.281673 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.310832 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.338805 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350576 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350611 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350627 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350646 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350695 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350708 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350723 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350740 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350755 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 
17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350769 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350784 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350799 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350813 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350853 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350868 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350883 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350898 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350928 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350965 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.350984 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 17:00:26 crc 
kubenswrapper[4841]: I1203 17:00:26.351006 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351033 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351057 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351077 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351092 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351107 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351122 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351138 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351161 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351189 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351208 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351224 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351239 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351256 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351276 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351297 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351314 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod 
\"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351346 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351371 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351385 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351400 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351413 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351428 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351443 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351459 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351473 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351486 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: 
\"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351502 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351516 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351531 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351546 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351560 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351562 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351575 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351627 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351655 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351661 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351680 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351777 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351827 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351876 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.351939 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 
17:00:26.351976 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352007 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352039 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352049 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352058 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352074 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352108 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352138 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352166 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352198 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352230 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352263 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352270 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352291 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352297 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352332 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352363 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352397 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352435 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352468 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352502 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352534 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352551 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352565 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352599 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352603 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352630 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352666 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352697 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352730 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352803 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352845 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352862 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352899 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352967 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.352999 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353026 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod 
"e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353029 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353080 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353104 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353129 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353154 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 17:00:26 crc 
kubenswrapper[4841]: I1203 17:00:26.353176 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353198 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353219 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353241 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353264 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353287 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: 
\"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353310 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353332 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353355 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353384 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353409 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353434 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353457 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353479 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353502 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353530 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353552 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353577 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353604 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353629 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353654 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353678 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " 
Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353703 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353726 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353752 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353775 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353798 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353823 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353844 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353898 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353942 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353967 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353992 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 17:00:26 crc kubenswrapper[4841]: 
I1203 17:00:26.354017 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354042 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354066 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354089 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354118 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354141 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354168 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354193 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354218 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354242 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354263 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 
03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354285 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354308 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354330 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354354 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354375 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354465 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354495 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354529 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354555 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354581 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354614 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: 
\"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354638 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354663 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354685 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354710 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354734 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354758 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354783 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354811 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354837 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354862 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354885 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354926 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354967 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354991 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355041 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355069 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355095 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355121 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355145 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355170 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355194 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355220 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355246 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355269 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355327 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355355 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355379 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355401 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355424 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355447 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355471 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355492 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355515 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355539 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355560 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355584 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355607 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355631 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355656 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355681 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355707 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355731 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355756 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355780 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355807 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355831 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355875 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355928 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355958 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 
17:00:26.355984 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356012 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356039 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356067 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356093 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356121 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356169 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356196 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356224 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356247 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356301 4841 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356315 4841 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356329 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356347 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356359 4841 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 
crc kubenswrapper[4841]: I1203 17:00:26.356376 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356389 4841 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356402 4841 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356415 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.357050 4841 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.359108 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.361787 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.369405 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371004 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd"} Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371047 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:26 crc 
kubenswrapper[4841]: I1203 17:00:26.377252 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.378779 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353046 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353204 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353368 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353603 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353713 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.353836 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354073 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354264 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354630 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.354792 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355042 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355216 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355273 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355458 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355678 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.355943 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380023 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356186 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356358 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356724 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.356948 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.357117 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.357306 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.357539 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.357889 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.358119 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.359430 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.359673 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.360066 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.360389 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.361022 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.361379 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.361448 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.361607 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.361891 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.362087 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.362097 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.362147 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.362338 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.362348 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.362535 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.362569 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.362685 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.363124 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.363403 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.363515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.363575 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.363653 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.363733 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.364039 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.364237 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.364388 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.364850 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.365196 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.365384 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.365594 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.365883 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.365989 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:00:26.865971774 +0000 UTC m=+21.253492501 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.366175 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.366300 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.366613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.366706 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.367294 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.367413 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.367598 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.367762 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.367869 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.368192 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.368148 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.368563 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.368750 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.369085 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.369721 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.370721 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.370962 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371024 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371442 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371450 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371513 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371611 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371808 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371892 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.371963 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.372169 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.372207 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.372218 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.372258 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.372437 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.372532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.372542 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.372919 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.373181 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.373224 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.373250 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.373419 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.373488 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.373538 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.373473 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.373596 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.373730 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.374152 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.374214 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.374250 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.374348 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.374268 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.374553 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.374702 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.374793 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.375086 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.375966 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.376305 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.376378 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.376382 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.376483 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.376720 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.376772 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.376857 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.376784 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.377012 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.377184 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.377175 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.377334 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.377829 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.377879 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.377982 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.378051 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.378055 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.378223 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.378227 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.378306 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.378561 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.378762 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.379150 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.379193 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.379237 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). 
InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.379117 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.379496 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.379773 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.379800 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380003 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380053 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380082 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380195 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380342 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380602 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380636 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380594 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.380863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.381261 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.381352 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.381481 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:26.881459023 +0000 UTC m=+21.268979750 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.382356 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.382421 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.381518 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.381554 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). 
InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.381517 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.381926 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.382110 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.382153 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.382231 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.383203 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:26.883173063 +0000 UTC m=+21.270693790 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.383557 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.383687 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.383768 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.383804 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.386357 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.392982 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.393375 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.393574 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.393604 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.393617 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.393679 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:26.893661343 +0000 UTC m=+21.281182070 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.396675 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.397272 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.397272 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.397725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.399135 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.399370 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.399386 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.399611 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.400215 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.401844 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.402401 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod 
\"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.407992 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.408020 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.408032 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.408082 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:26.908066256 +0000 UTC m=+21.295586983 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.409370 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.409626 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.410567 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.410733 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.411292 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.411347 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.411485 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.411920 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.412118 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.413365 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.413748 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.413815 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.415151 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.421183 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.422355 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.423009 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.425962 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.431247 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"
cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-
operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.434809 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.441063 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.450720 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457085 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457232 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: 
\"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457255 4841 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457273 4841 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457289 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457305 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457315 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457322 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457392 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" 
(UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457403 4841 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457414 4841 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457423 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457432 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457441 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457496 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457505 4841 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc 
kubenswrapper[4841]: I1203 17:00:26.457513 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457523 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457532 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457542 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457602 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457613 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457623 4841 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457634 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457643 4841 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457652 4841 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457660 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457668 4841 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457677 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on 
node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457685 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457694 4841 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457703 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457713 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457722 4841 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457731 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457740 4841 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc 
kubenswrapper[4841]: I1203 17:00:26.457755 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457764 4841 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457772 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457780 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457789 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457797 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457805 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457813 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457821 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457830 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457838 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457847 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457856 4841 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457864 4841 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457872 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457880 4841 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457889 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457897 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457936 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457945 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457954 4841 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457963 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457971 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457979 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457987 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.457995 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458005 4841 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458013 4841 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458022 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" 
DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458030 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458038 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458046 4841 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458053 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458061 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458069 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458077 4841 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458085 4841 reconciler_common.go:293] "Volume 
detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458092 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458100 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458108 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458116 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458124 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458142 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458151 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on 
node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458160 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458168 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458177 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458186 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458195 4841 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458204 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458215 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" 
DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458224 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458232 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458246 4841 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458254 4841 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458263 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458272 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458282 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 
17:00:26.458291 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458299 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458308 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458317 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458332 4841 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458341 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458350 4841 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458359 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458368 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458377 4841 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458386 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458394 4841 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458402 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458412 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458421 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node 
\"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458430 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458439 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458448 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458458 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458470 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458478 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458487 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 
17:00:26.458496 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458505 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458515 4841 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458524 4841 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458534 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458542 4841 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458551 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458561 4841 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458569 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458578 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458587 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458596 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458606 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458614 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458623 4841 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458632 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458641 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458650 4841 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458660 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458669 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458677 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458685 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" 
DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458693 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458702 4841 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458711 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458720 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458728 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458738 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458747 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node 
\"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458757 4841 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458766 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458774 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458783 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458791 4841 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458800 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458810 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458819 4841 reconciler_common.go:293] "Volume 
detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458830 4841 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458847 4841 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458861 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458872 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458884 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458895 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458979 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node 
\"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.458992 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459003 4841 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459016 4841 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459028 4841 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459041 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459065 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459077 4841 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459089 4841 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459112 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459126 4841 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459139 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459153 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459165 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459178 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459191 4841 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459202 4841 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459213 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459411 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459421 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459432 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459444 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459454 4841 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node 
\"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459463 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459473 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459483 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459493 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.459511 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.461413 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.471542 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.483367 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.496676 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.503975 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.512155 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.513050 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 17:00:26 crc kubenswrapper[4841]: W1203 17:00:26.522109 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-fe500c9792ad2b81e2f8a86bd456c2215cfcb2ab4ec918fe168fcee2d3e61afe WatchSource:0}: Error finding container fe500c9792ad2b81e2f8a86bd456c2215cfcb2ab4ec918fe168fcee2d3e61afe: Status 404 returned error can't find the container with id fe500c9792ad2b81e2f8a86bd456c2215cfcb2ab4ec918fe168fcee2d3e61afe Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.525983 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.531949 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.536394 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.545646 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: W1203 17:00:26.555483 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d9be217d88f831777023e87d9911618c2bbe5dc701bb1cc3e99196df5ca29342 WatchSource:0}: Error finding container d9be217d88f831777023e87d9911618c2bbe5dc701bb1cc3e99196df5ca29342: Status 404 returned error can't find the container with id d9be217d88f831777023e87d9911618c2bbe5dc701bb1cc3e99196df5ca29342 Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.556148 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.568676 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.962175 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.962251 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.962281 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962327 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:00:27.962306698 +0000 UTC m=+22.349827425 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.962351 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:26 crc kubenswrapper[4841]: I1203 17:00:26.962375 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:26 
crc kubenswrapper[4841]: E1203 17:00:26.962397 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962439 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962458 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962472 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962480 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962447 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962516 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962525 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962500 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:27.962473182 +0000 UTC m=+22.349993959 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962567 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:27.962555794 +0000 UTC m=+22.350076601 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962585 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-03 17:00:27.962576895 +0000 UTC m=+22.350097712 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:26 crc kubenswrapper[4841]: E1203 17:00:26.962599 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:27.962591085 +0000 UTC m=+22.350111902 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.238787 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.238846 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.238971 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.239059 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.374320 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab"} Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.374366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"fe500c9792ad2b81e2f8a86bd456c2215cfcb2ab4ec918fe168fcee2d3e61afe"} Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.375462 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" 
event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d9be217d88f831777023e87d9911618c2bbe5dc701bb1cc3e99196df5ca29342"} Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.377463 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3"} Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.377517 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f"} Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.377532 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"79aee7231e02658fb0dc8546b86de113313d19ac27a7ccd5b28cb85db7f0d935"} Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.391955 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.404304 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.417191 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.430256 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.443181 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC 
(now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca
32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.456558 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.469644 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.480585 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.492701 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.505022 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.520511 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.537425 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.555855 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC 
(now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca
32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.566939 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.587164 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.596397 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:27Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.971121 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.971202 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.971231 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.971260 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:27 crc kubenswrapper[4841]: I1203 17:00:27.971282 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971380 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971401 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971425 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:27 crc 
kubenswrapper[4841]: E1203 17:00:27.971437 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971445 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:00:29.971412875 +0000 UTC m=+24.358933602 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971462 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971487 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971504 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971506 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971489 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:29.971472367 +0000 UTC m=+24.358993094 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971566 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:29.971544389 +0000 UTC m=+24.359065196 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971603 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:29.971574799 +0000 UTC m=+24.359095526 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:27 crc kubenswrapper[4841]: E1203 17:00:27.971621 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:29.97161487 +0000 UTC m=+24.359135597 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.238677 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:28 crc kubenswrapper[4841]: E1203 17:00:28.239158 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.244331 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.246163 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.248726 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.249389 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.249978 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.250429 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.251013 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.251506 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.252094 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.252571 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.253058 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.253661 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.254122 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.254612 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.255099 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.255589 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.256104 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.256456 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.257003 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.257522 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.260575 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.261196 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.261592 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.262521 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.262928 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.263870 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.264498 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.265482 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.266019 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.266830 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.267278 4841 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.267375 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.269263 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.269753 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.270154 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.271571 4841 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.272526 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.273038 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.274004 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.274648 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.275494 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.276062 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.276952 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.277501 4841 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.278284 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.278855 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.279848 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.280790 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.281801 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.282345 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.283417 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.284008 4841 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.284546 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 17:00:28 crc kubenswrapper[4841]: I1203 17:00:28.285355 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.037178 4841 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.038599 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.038635 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.038646 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.038699 4841 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.045208 4841 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.045424 4841 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.046256 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 
17:00:29.046287 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.046296 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.046309 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.046319 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: E1203 17:00:29.064230 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.067358 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.067397 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.067407 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.067421 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.067430 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: E1203 17:00:29.077829 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.080876 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.080947 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.080961 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.080977 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.080985 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: E1203 17:00:29.093515 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.096575 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.096605 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.096613 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.096637 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.096646 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: E1203 17:00:29.109211 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.112342 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.112429 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.112442 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.112459 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.112487 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: E1203 17:00:29.126035 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: E1203 17:00:29.126157 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.127830 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.127866 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.127878 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.127893 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.127922 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.232743 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.232779 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.232789 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.232803 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.232812 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.238481 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.238514 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:29 crc kubenswrapper[4841]: E1203 17:00:29.238576 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:29 crc kubenswrapper[4841]: E1203 17:00:29.238698 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.334514 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.334550 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.334560 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.334576 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.334589 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.344133 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-86vxf"] Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.344431 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-qwsc4"] Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.344582 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-86vxf" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.344677 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.344600 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-qpptb"] Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.345715 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-c9kmk"] Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.345859 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.346084 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.347108 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.347410 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.347973 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.355300 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.355458 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.356395 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.356494 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.357686 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.358424 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.358829 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.362677 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.362960 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.363057 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.363100 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.363065 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.383298 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.384520 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.409697 4841 
status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.437780 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.437818 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.437826 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.437843 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.437853 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.438235 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.453179 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.471379 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482242 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-multus-socket-dir-parent\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482296 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-mcd-auth-proxy-config\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0752d936-15ef-4e17-8463-3185a4c1863b-cni-binary-copy\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482328 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-rootfs\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482344 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-run-k8s-cni-cncf-io\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482358 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-var-lib-kubelet\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482371 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-run-multus-certs\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482396 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-system-cni-dir\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482533 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-var-lib-cni-bin\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482581 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-var-lib-cni-multus\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482654 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-cnibin\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482674 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-hostroot\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482708 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dsfc\" (UniqueName: \"kubernetes.io/projected/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-kube-api-access-8dsfc\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482739 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-multus-cni-dir\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482757 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-run-netns\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482777 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46566bd5-34f0-4858-9a20-3a78c292e4ba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482807 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0752d936-15ef-4e17-8463-3185a4c1863b-multus-daemon-config\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482829 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gc8l\" (UniqueName: \"kubernetes.io/projected/0752d936-15ef-4e17-8463-3185a4c1863b-kube-api-access-4gc8l\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482848 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2mz6\" (UniqueName: \"kubernetes.io/projected/46566bd5-34f0-4858-9a20-3a78c292e4ba-kube-api-access-b2mz6\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482874 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-os-release\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482900 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-cnibin\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482936 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-proxy-tls\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.482976 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn4dk\" (UniqueName: \"kubernetes.io/projected/e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6-kube-api-access-vn4dk\") pod \"node-resolver-86vxf\" (UID: \"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\") " pod="openshift-dns/node-resolver-86vxf" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 
17:00:29.482989 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-system-cni-dir\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.483026 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-multus-conf-dir\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.483043 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6-hosts-file\") pod \"node-resolver-86vxf\" (UID: \"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\") " pod="openshift-dns/node-resolver-86vxf" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.483085 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-etc-kubernetes\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.483101 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46566bd5-34f0-4858-9a20-3a78c292e4ba-cni-binary-copy\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc 
kubenswrapper[4841]: I1203 17:00:29.483126 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-os-release\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.485458 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var
/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.497601 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.511808 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.528564 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.538094 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.542531 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.542569 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.542580 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.542597 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.542608 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.554015 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.567478 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 
17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.577678 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584245 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0752d936-15ef-4e17-8463-3185a4c1863b-multus-daemon-config\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584295 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gc8l\" (UniqueName: \"kubernetes.io/projected/0752d936-15ef-4e17-8463-3185a4c1863b-kube-api-access-4gc8l\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584319 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2mz6\" (UniqueName: \"kubernetes.io/projected/46566bd5-34f0-4858-9a20-3a78c292e4ba-kube-api-access-b2mz6\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584350 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-os-release\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584371 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-cnibin\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584390 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584410 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-proxy-tls\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584436 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn4dk\" (UniqueName: \"kubernetes.io/projected/e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6-kube-api-access-vn4dk\") pod \"node-resolver-86vxf\" (UID: \"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\") " pod="openshift-dns/node-resolver-86vxf" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-system-cni-dir\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-multus-conf-dir\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584498 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6-hosts-file\") pod \"node-resolver-86vxf\" (UID: \"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\") " pod="openshift-dns/node-resolver-86vxf" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584528 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-etc-kubernetes\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584518 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-cnibin\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584548 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46566bd5-34f0-4858-9a20-3a78c292e4ba-cni-binary-copy\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584623 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-os-release\") pod 
\"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584655 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6-hosts-file\") pod \"node-resolver-86vxf\" (UID: \"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\") " pod="openshift-dns/node-resolver-86vxf" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584663 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-multus-socket-dir-parent\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584702 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-system-cni-dir\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584714 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-mcd-auth-proxy-config\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584727 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-etc-kubernetes\") pod \"multus-qwsc4\" (UID: 
\"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584769 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-os-release\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584807 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-multus-socket-dir-parent\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0752d936-15ef-4e17-8463-3185a4c1863b-cni-binary-copy\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584834 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-os-release\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584839 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-rootfs\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: 
I1203 17:00:29.584854 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-rootfs\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584879 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-run-k8s-cni-cncf-io\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584896 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-var-lib-kubelet\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584927 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-run-multus-certs\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584945 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-system-cni-dir\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584944 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-run-k8s-cni-cncf-io\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-var-lib-cni-bin\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584976 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-var-lib-kubelet\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584988 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-run-multus-certs\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.584991 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-var-lib-cni-multus\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585014 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-var-lib-cni-bin\") 
pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585014 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-var-lib-cni-multus\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585025 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-system-cni-dir\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585030 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-cnibin\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585056 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-cnibin\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-hostroot\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585095 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8dsfc\" (UniqueName: \"kubernetes.io/projected/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-kube-api-access-8dsfc\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585114 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-multus-cni-dir\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585131 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-run-netns\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585149 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46566bd5-34f0-4858-9a20-3a78c292e4ba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585239 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-multus-cni-dir\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585247 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-host-run-netns\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585297 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-hostroot\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585584 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0752d936-15ef-4e17-8463-3185a4c1863b-multus-daemon-config\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585600 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-mcd-auth-proxy-config\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585647 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0752d936-15ef-4e17-8463-3185a4c1863b-multus-conf-dir\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585672 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0752d936-15ef-4e17-8463-3185a4c1863b-cni-binary-copy\") pod \"multus-qwsc4\" (UID: 
\"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585706 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/46566bd5-34f0-4858-9a20-3a78c292e4ba-cni-binary-copy\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585825 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/46566bd5-34f0-4858-9a20-3a78c292e4ba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.585969 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/46566bd5-34f0-4858-9a20-3a78c292e4ba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.591197 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.595138 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-proxy-tls\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.601121 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dsfc\" (UniqueName: \"kubernetes.io/projected/2cd214d0-d838-44a7-8a1a-ef7855cc1bd3-kube-api-access-8dsfc\") pod \"machine-config-daemon-c9kmk\" (UID: \"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\") " pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.601481 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gc8l\" (UniqueName: \"kubernetes.io/projected/0752d936-15ef-4e17-8463-3185a4c1863b-kube-api-access-4gc8l\") pod \"multus-qwsc4\" (UID: \"0752d936-15ef-4e17-8463-3185a4c1863b\") " pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.601522 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2mz6\" (UniqueName: \"kubernetes.io/projected/46566bd5-34f0-4858-9a20-3a78c292e4ba-kube-api-access-b2mz6\") pod \"multus-additional-cni-plugins-qpptb\" (UID: \"46566bd5-34f0-4858-9a20-3a78c292e4ba\") " pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.602927 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn4dk\" (UniqueName: \"kubernetes.io/projected/e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6-kube-api-access-vn4dk\") pod \"node-resolver-86vxf\" (UID: \"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\") " pod="openshift-dns/node-resolver-86vxf" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.604982 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.617363 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.629850 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.643524 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.645127 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.645163 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.645175 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.645191 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.645204 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.655186 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.658871 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qwsc4" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.669152 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-86vxf" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.673434 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.683821 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:00:29 crc kubenswrapper[4841]: W1203 17:00:29.684322 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0752d936_15ef_4e17_8463_3185a4c1863b.slice/crio-551ce88681814f161809c9f6847833489be53c8f5259fa08baed79164b24f80b WatchSource:0}: Error finding container 551ce88681814f161809c9f6847833489be53c8f5259fa08baed79164b24f80b: Status 404 returned error can't find the container with id 551ce88681814f161809c9f6847833489be53c8f5259fa08baed79164b24f80b Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.688637 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-qpptb" Dec 03 17:00:29 crc kubenswrapper[4841]: W1203 17:00:29.695792 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cd214d0_d838_44a7_8a1a_ef7855cc1bd3.slice/crio-d95a6aabec3e0469f1bd3ce631ffa30221f3e23b6be8fda17927b409f8d59554 WatchSource:0}: Error finding container d95a6aabec3e0469f1bd3ce631ffa30221f3e23b6be8fda17927b409f8d59554: Status 404 returned error can't find the container with id d95a6aabec3e0469f1bd3ce631ffa30221f3e23b6be8fda17927b409f8d59554 Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.702093 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.721167 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.748732 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.748770 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.748780 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.748794 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.748805 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.773288 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5svt"] Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.774296 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.776504 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.776686 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.776739 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.776740 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.776786 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.777330 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.780575 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.791991 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.832231 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.860522 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.860561 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.860571 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.860586 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.860597 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.870805 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.903206 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905272 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-etc-openvswitch\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-netd\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905333 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-config\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905352 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-kubelet\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-slash\") pod \"ovnkube-node-d5svt\" 
(UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905388 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-ovn\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905407 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-bin\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905426 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-script-lib\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905469 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-openvswitch\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905490 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-env-overrides\") pod 
\"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905512 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905544 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-systemd-units\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905564 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-log-socket\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905589 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-var-lib-openvswitch\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905609 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-node-log\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905638 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905658 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-systemd\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905679 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-netns\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905714 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1853b500-b218-4412-9cbc-9fd0a76778c0-ovn-node-metrics-cert\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.905735 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xgj9j\" (UniqueName: \"kubernetes.io/projected/1853b500-b218-4412-9cbc-9fd0a76778c0-kube-api-access-xgj9j\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.935123 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.960501 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.963656 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.963688 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.963696 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.963709 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.963718 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:29Z","lastTransitionTime":"2025-12-03T17:00:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.972342 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d
38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.984271 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:29 crc kubenswrapper[4841]: I1203 17:00:29.997750 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:29Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.006548 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.006716 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:00:34.006688625 +0000 UTC m=+28.394209352 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.006994 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007131 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-ovn-kubernetes\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007141 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-systemd\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-netns\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007224 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007243 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1853b500-b218-4412-9cbc-9fd0a76778c0-ovn-node-metrics-cert\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007258 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xgj9j\" (UniqueName: \"kubernetes.io/projected/1853b500-b218-4412-9cbc-9fd0a76778c0-kube-api-access-xgj9j\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007281 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007298 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-etc-openvswitch\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007317 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-netd\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007306 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-systemd\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007336 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-config\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007410 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-kubelet\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007432 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-slash\") pod 
\"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007451 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-ovn\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007470 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-bin\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007492 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-script-lib\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007516 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-openvswitch\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007535 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-env-overrides\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007560 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007587 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007614 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-systemd-units\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007637 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-log-socket\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007661 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: 
\"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007684 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-var-lib-openvswitch\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007703 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-node-log\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007771 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-node-log\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007800 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-kubelet\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007826 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-slash\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc 
kubenswrapper[4841]: I1203 17:00:30.007853 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-ovn\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007882 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-bin\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.007977 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-config\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.008013 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-netns\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008080 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008092 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 
17:00:30.008101 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008133 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:34.008124339 +0000 UTC m=+28.395645066 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008495 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008539 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:34.008529579 +0000 UTC m=+28.396050306 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.008565 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-script-lib\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.008561 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-log-socket\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.008595 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-var-lib-openvswitch\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.008569 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-systemd-units\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.008594 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-etc-openvswitch\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.008589 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.008628 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-openvswitch\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008679 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.008685 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-netd\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008734 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2025-12-03 17:00:34.008722613 +0000 UTC m=+28.396243340 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008682 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008817 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008840 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.008930 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:34.008892967 +0000 UTC m=+28.396413754 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.010037 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-env-overrides\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.010901 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.011684 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1853b500-b218-4412-9cbc-9fd0a76778c0-ovn-node-metrics-cert\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.024231 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.031253 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgj9j\" (UniqueName: \"kubernetes.io/projected/1853b500-b218-4412-9cbc-9fd0a76778c0-kube-api-access-xgj9j\") pod \"ovnkube-node-d5svt\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.034628 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.047618 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.066171 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.066207 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.066216 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.066230 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.066239 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.087255 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:30 crc kubenswrapper[4841]: W1203 17:00:30.099401 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1853b500_b218_4412_9cbc_9fd0a76778c0.slice/crio-da34b91f09c49509e843256b513d328bb83f8b3f9364e0a11f24b39df1a668ca WatchSource:0}: Error finding container da34b91f09c49509e843256b513d328bb83f8b3f9364e0a11f24b39df1a668ca: Status 404 returned error can't find the container with id da34b91f09c49509e843256b513d328bb83f8b3f9364e0a11f24b39df1a668ca Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.168127 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.168164 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.168173 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.168187 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.168197 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.238739 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:30 crc kubenswrapper[4841]: E1203 17:00:30.238878 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.270924 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.270958 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.270966 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.270980 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.270992 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.376301 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.376335 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.376350 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.376365 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.376376 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.389436 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc" exitCode=0 Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.389505 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.389552 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"da34b91f09c49509e843256b513d328bb83f8b3f9364e0a11f24b39df1a668ca"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.390997 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-86vxf" event={"ID":"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6","Type":"ContainerStarted","Data":"96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.391042 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-86vxf" event={"ID":"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6","Type":"ContainerStarted","Data":"9ee80c101b6b47a9b4d8a6c0f56e6f56e34aa02b627db144839573cc4904ac90"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.395296 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qwsc4" event={"ID":"0752d936-15ef-4e17-8463-3185a4c1863b","Type":"ContainerStarted","Data":"4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.395350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-qwsc4" event={"ID":"0752d936-15ef-4e17-8463-3185a4c1863b","Type":"ContainerStarted","Data":"551ce88681814f161809c9f6847833489be53c8f5259fa08baed79164b24f80b"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.404992 4841 generic.go:334] "Generic (PLEG): container finished" podID="46566bd5-34f0-4858-9a20-3a78c292e4ba" containerID="c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7" exitCode=0 Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.405094 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerDied","Data":"c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.405175 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerStarted","Data":"1dd6c5623fd06f70cbbab4f13e9be526feaad4a10519680026ac65dacde26ce2"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.408396 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.408484 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.408502 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" 
event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"d95a6aabec3e0469f1bd3ce631ffa30221f3e23b6be8fda17927b409f8d59554"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.412242 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.438423 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 
17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.448658 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.462422 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.483490 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.483920 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.483754 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.483934 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.484080 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.484102 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.497849 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.509200 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.522427 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.536288 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed
21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.551704 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.566214 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.578509 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.586044 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.586067 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.586075 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 
17:00:30.586088 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.586099 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.591143 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.602466 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.615784 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"
name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"qua
y.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] 
issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.629086 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.639865 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.651215 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.669626 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.683288 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.687797 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.687829 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.687839 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.687854 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.687865 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.696226 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.707120 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.719826 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.731622 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.743755 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.761292 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:30Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.789673 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.789709 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.789718 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.789733 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.789744 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.892444 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.892894 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.893010 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.893086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.893158 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.997358 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.997458 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.997470 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.997516 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:30 crc kubenswrapper[4841]: I1203 17:00:30.997532 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:30Z","lastTransitionTime":"2025-12-03T17:00:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.103265 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.103298 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.103310 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.103326 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.103337 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:31Z","lastTransitionTime":"2025-12-03T17:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.205041 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.205074 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.205083 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.205097 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.205105 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:31Z","lastTransitionTime":"2025-12-03T17:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.237715 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.237783 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:31 crc kubenswrapper[4841]: E1203 17:00:31.237848 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:31 crc kubenswrapper[4841]: E1203 17:00:31.237962 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.307922 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.307975 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.307988 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.308004 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.308017 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:31Z","lastTransitionTime":"2025-12-03T17:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.410534 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.410865 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.410876 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.410892 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.410918 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:31Z","lastTransitionTime":"2025-12-03T17:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.414584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerStarted","Data":"9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.417680 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.417702 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.417714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.417724 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.417738 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} Dec 03 17:00:31 crc 
kubenswrapper[4841]: I1203 17:00:31.431689 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.447214 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.458183 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.468979 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.482633 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.496922 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.510116 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.512699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.512725 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.512736 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.512755 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.512767 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:31Z","lastTransitionTime":"2025-12-03T17:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.525059 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.545205 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC 
(now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca
32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.560596 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.585761 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.596653 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.610520 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.615331 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:31 crc 
kubenswrapper[4841]: I1203 17:00:31.615376 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.615395 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.615419 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.615436 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:31Z","lastTransitionTime":"2025-12-03T17:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.656373 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-bglx8"] Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.656764 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.659749 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.660849 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.660855 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.661217 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.673141 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.688303 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.706019 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.717363 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.717399 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.717410 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 
17:00:31.717427 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.717438 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:31Z","lastTransitionTime":"2025-12-03T17:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.719545 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.725280 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dac136e3-68d3-410c-b010-a7509fffb25c-host\") pod \"node-ca-bglx8\" (UID: \"dac136e3-68d3-410c-b010-a7509fffb25c\") " pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 
17:00:31.725343 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvdx9\" (UniqueName: \"kubernetes.io/projected/dac136e3-68d3-410c-b010-a7509fffb25c-kube-api-access-kvdx9\") pod \"node-ca-bglx8\" (UID: \"dac136e3-68d3-410c-b010-a7509fffb25c\") " pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.725377 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dac136e3-68d3-410c-b010-a7509fffb25c-serviceca\") pod \"node-ca-bglx8\" (UID: \"dac136e3-68d3-410c-b010-a7509fffb25c\") " pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.730148 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\
\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.742600 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.756523 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.767514 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.784733 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.795111 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.805660 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.816257 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.819801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.819828 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.819839 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.819856 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.819867 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:31Z","lastTransitionTime":"2025-12-03T17:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.826312 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dac136e3-68d3-410c-b010-a7509fffb25c-host\") pod \"node-ca-bglx8\" (UID: \"dac136e3-68d3-410c-b010-a7509fffb25c\") " pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.826381 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvdx9\" (UniqueName: \"kubernetes.io/projected/dac136e3-68d3-410c-b010-a7509fffb25c-kube-api-access-kvdx9\") pod \"node-ca-bglx8\" (UID: \"dac136e3-68d3-410c-b010-a7509fffb25c\") " pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.826414 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dac136e3-68d3-410c-b010-a7509fffb25c-serviceca\") pod \"node-ca-bglx8\" (UID: \"dac136e3-68d3-410c-b010-a7509fffb25c\") " pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.826383 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dac136e3-68d3-410c-b010-a7509fffb25c-host\") pod \"node-ca-bglx8\" (UID: \"dac136e3-68d3-410c-b010-a7509fffb25c\") " pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.827446 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dac136e3-68d3-410c-b010-a7509fffb25c-serviceca\") pod \"node-ca-bglx8\" (UID: \"dac136e3-68d3-410c-b010-a7509fffb25c\") " pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.827835 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt
/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.843471 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:31Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.847470 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvdx9\" (UniqueName: \"kubernetes.io/projected/dac136e3-68d3-410c-b010-a7509fffb25c-kube-api-access-kvdx9\") pod \"node-ca-bglx8\" (UID: \"dac136e3-68d3-410c-b010-a7509fffb25c\") " pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.921213 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.921252 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.921260 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.921274 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.921286 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:31Z","lastTransitionTime":"2025-12-03T17:00:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:31 crc kubenswrapper[4841]: I1203 17:00:31.969797 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-bglx8" Dec 03 17:00:31 crc kubenswrapper[4841]: W1203 17:00:31.987671 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddac136e3_68d3_410c_b010_a7509fffb25c.slice/crio-f62ff004658c757c547f544022a87d32fa4110dde9d71c5e895833046feae808 WatchSource:0}: Error finding container f62ff004658c757c547f544022a87d32fa4110dde9d71c5e895833046feae808: Status 404 returned error can't find the container with id f62ff004658c757c547f544022a87d32fa4110dde9d71c5e895833046feae808 Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.024040 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.024078 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.024089 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.024104 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.024116 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.127758 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.127802 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.127813 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.127830 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.127842 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.230961 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.231243 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.231251 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.231264 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.231272 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.238319 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:32 crc kubenswrapper[4841]: E1203 17:00:32.238428 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.334227 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.334286 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.334297 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.334320 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.334337 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.423479 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bglx8" event={"ID":"dac136e3-68d3-410c-b010-a7509fffb25c","Type":"ContainerStarted","Data":"ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.423549 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-bglx8" event={"ID":"dac136e3-68d3-410c-b010-a7509fffb25c","Type":"ContainerStarted","Data":"f62ff004658c757c547f544022a87d32fa4110dde9d71c5e895833046feae808"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.425945 4841 generic.go:334] "Generic (PLEG): container finished" podID="46566bd5-34f0-4858-9a20-3a78c292e4ba" containerID="9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be" exitCode=0 Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.426035 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerDied","Data":"9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.432975 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.437253 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.437304 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.437356 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.437379 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.437398 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.444786 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.459224 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.474768 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.489697 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.505852 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.522885 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.536883 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.540628 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.541367 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.541667 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 
17:00:32.541691 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.541702 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.551873 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4e
f318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.563443 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.574551 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.586309 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.596553 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.609597 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.629619 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.643243 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.644542 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.644593 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.644609 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.644632 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.644647 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.659448 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.671265 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.683436 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.704212 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.715985 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.726152 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.741183 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.746706 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.746747 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.746762 4841 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.746779 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.746791 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.753241 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07
372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.767146 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.778535 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.791513 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.801815 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.812804 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:32Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.848495 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.848530 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.848539 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.848553 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.848562 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.950872 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.950928 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.950940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.950956 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:32 crc kubenswrapper[4841]: I1203 17:00:32.950969 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:32Z","lastTransitionTime":"2025-12-03T17:00:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.053572 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.053670 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.053698 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.053734 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.053761 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.155999 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.156043 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.156055 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.156073 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.156085 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.238770 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.238784 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:33 crc kubenswrapper[4841]: E1203 17:00:33.239003 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:33 crc kubenswrapper[4841]: E1203 17:00:33.239073 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.260123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.260191 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.260209 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.260235 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.260251 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.363408 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.363467 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.363477 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.363500 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.363513 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.438975 4841 generic.go:334] "Generic (PLEG): container finished" podID="46566bd5-34f0-4858-9a20-3a78c292e4ba" containerID="d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08" exitCode=0 Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.439026 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerDied","Data":"d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.455284 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef
318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.466022 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.466067 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.466079 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.466097 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.466107 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.466691 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.484374 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.504323 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.518152 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.535341 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.547774 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.560012 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.568831 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.568872 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.568893 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.568925 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.568937 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.572724 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.584486 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.599933 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.609136 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.623457 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 
17:00:33.636719 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:33Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.671273 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.671305 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.671315 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.671330 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.671339 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.774205 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.774258 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.774267 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.774283 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.774298 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.888316 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.888439 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.888460 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.888484 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.888498 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.992123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.992195 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.992214 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.992241 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:33 crc kubenswrapper[4841]: I1203 17:00:33.992260 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:33Z","lastTransitionTime":"2025-12-03T17:00:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.049819 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.050032 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050131 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:00:42.050092838 +0000 UTC m=+36.437613595 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050177 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050202 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050220 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.050222 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050280 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2025-12-03 17:00:42.050261692 +0000 UTC m=+36.437782459 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.050310 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.050364 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050454 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050507 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050524 4841 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:42.050503928 +0000 UTC m=+36.438024685 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050589 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:42.050563749 +0000 UTC m=+36.438084516 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050672 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050698 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050719 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.050790 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:42.050770604 +0000 UTC m=+36.438291371 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.095800 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.095865 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.095887 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.095975 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.096047 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:34Z","lastTransitionTime":"2025-12-03T17:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.199950 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.200041 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.200068 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.200096 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.200120 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:34Z","lastTransitionTime":"2025-12-03T17:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.238536 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:34 crc kubenswrapper[4841]: E1203 17:00:34.238762 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.303086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.303146 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.303164 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.303191 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.303213 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:34Z","lastTransitionTime":"2025-12-03T17:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.406681 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.406744 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.406760 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.406785 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.406801 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:34Z","lastTransitionTime":"2025-12-03T17:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.445720 4841 generic.go:334] "Generic (PLEG): container finished" podID="46566bd5-34f0-4858-9a20-3a78c292e4ba" containerID="f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6" exitCode=0 Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.445820 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerDied","Data":"f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.456648 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.465461 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.486456 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.500674 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.510066 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.510102 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.510111 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.510125 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.510135 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:34Z","lastTransitionTime":"2025-12-03T17:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.513466 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.524664 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.536786 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.552470 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.564782 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.580523 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.595769 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.609309 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.614118 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.614175 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.614189 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 
17:00:34.614214 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.614230 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:34Z","lastTransitionTime":"2025-12-03T17:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.621251 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.630950 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.643212 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:34Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.718088 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.718130 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.718139 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.718157 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.718172 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:34Z","lastTransitionTime":"2025-12-03T17:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.821592 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.821661 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.821684 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.821714 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.821738 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:34Z","lastTransitionTime":"2025-12-03T17:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.923690 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.923741 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.923754 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.923777 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:34 crc kubenswrapper[4841]: I1203 17:00:34.923790 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:34Z","lastTransitionTime":"2025-12-03T17:00:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.027164 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.027215 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.027229 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.027251 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.027270 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.130505 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.130561 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.130573 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.130599 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.130616 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.232634 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.232680 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.232692 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.232713 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.232725 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.237974 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.238022 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:35 crc kubenswrapper[4841]: E1203 17:00:35.238106 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:35 crc kubenswrapper[4841]: E1203 17:00:35.238189 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.335222 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.335264 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.335275 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.335292 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.335305 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.440573 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.440628 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.440641 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.440661 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.440678 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.463300 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerStarted","Data":"2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.478305 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.488543 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.498428 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.512890 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2m
z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.528896 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMoun
ts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.543015 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.543055 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.543066 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.543083 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.543096 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.547941 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.558787 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.572899 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.584529 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.600613 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.616432 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 
17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.628503 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.645852 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.645890 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.645924 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.645948 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.645966 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.648436 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z 
is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.677203 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:35Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.748604 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.748671 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.748689 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.748717 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.748736 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.851556 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.851621 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.851639 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.851663 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.851679 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.954998 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.955066 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.955089 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.955120 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:35 crc kubenswrapper[4841]: I1203 17:00:35.955143 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:35Z","lastTransitionTime":"2025-12-03T17:00:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.057470 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.057533 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.057556 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.057587 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.057611 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.160859 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.161242 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.161256 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.161275 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.161288 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.237886 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:36 crc kubenswrapper[4841]: E1203 17:00:36.238056 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.255003 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/
crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.263531 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.263567 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.263579 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.263597 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.263609 4841 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.274044 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name
\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.289372 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.302897 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.315511 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.329432 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.348744 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2m
z6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.362411 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426
f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.366666 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.366721 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.366731 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.366751 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.366765 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.378563 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.405189 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.421210 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 
17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.435287 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.454133 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\
\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.468420 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.468446 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.468457 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.468473 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.468483 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.469044 4841 generic.go:334] "Generic (PLEG): container finished" podID="46566bd5-34f0-4858-9a20-3a78c292e4ba" containerID="2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55" exitCode=0 Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.469092 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerDied","Data":"2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.476677 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.477090 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.477986 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.478088 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.487354 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.501653 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.505742 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.508211 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.515864 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.535216 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.545481 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.561828 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.571188 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.571231 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.571244 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.571263 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.571275 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.576205 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d
38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.590593 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.604333 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.618640 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.633189 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.643640 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.657327 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.673140 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.673178 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.673187 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.673201 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.673211 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.679193 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.692688 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 
17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.710147 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.729219 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC 
(now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca
32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.745018 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.758450 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.774529 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.776152 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.776181 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.776190 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.776205 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.776213 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.787685 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.804771 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.815139 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.825825 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.835404 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.849192 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.859251 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.867445 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.875520 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:36Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.877844 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.877868 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.877877 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.877892 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.877918 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.981034 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.981083 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.981095 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.981111 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:36 crc kubenswrapper[4841]: I1203 17:00:36.981122 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:36Z","lastTransitionTime":"2025-12-03T17:00:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.084220 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.084286 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.084312 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.084340 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.084361 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:37Z","lastTransitionTime":"2025-12-03T17:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.187187 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.187220 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.187232 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.187247 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.187257 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:37Z","lastTransitionTime":"2025-12-03T17:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.238396 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.238405 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:37 crc kubenswrapper[4841]: E1203 17:00:37.238620 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:37 crc kubenswrapper[4841]: E1203 17:00:37.238742 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.289832 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.289896 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.289945 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.289971 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.289988 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:37Z","lastTransitionTime":"2025-12-03T17:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.392349 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.392386 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.392415 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.392429 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.392438 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:37Z","lastTransitionTime":"2025-12-03T17:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.483445 4841 generic.go:334] "Generic (PLEG): container finished" podID="46566bd5-34f0-4858-9a20-3a78c292e4ba" containerID="1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f" exitCode=0 Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.483516 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerDied","Data":"1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.494700 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.494737 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.494748 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.494765 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.494778 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:37Z","lastTransitionTime":"2025-12-03T17:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.502300 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.523566 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.542887 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.555611 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.566052 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.583159 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.597474 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.597802 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.597828 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.597837 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.597851 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.597862 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:37Z","lastTransitionTime":"2025-12-03T17:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.610534 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.624378 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.638380 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.651397 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.662591 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.676515 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 
envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.690337 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:37Z is after 2025-08-24T17:21:41Z" Dec 03 
17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.701486 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.701526 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.701536 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.701550 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.701560 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:37Z","lastTransitionTime":"2025-12-03T17:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.804391 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.804447 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.804463 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.804483 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.804496 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:37Z","lastTransitionTime":"2025-12-03T17:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.906963 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.907005 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.907018 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.907033 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.907045 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:37Z","lastTransitionTime":"2025-12-03T17:00:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:37 crc kubenswrapper[4841]: I1203 17:00:37.999889 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.008576 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.008609 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.008618 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.008631 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.008639 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.013889 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.028328 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.040496 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.054174 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.064582 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.078247 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.093706 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.109227 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.111118 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.111148 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.111159 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc 
kubenswrapper[4841]: I1203 17:00:38.111173 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.111183 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.119756 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.136052 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.148960 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 
17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.165395 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.180058 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.195927 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.213651 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.213695 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.213711 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.213734 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.213752 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.238270 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:38 crc kubenswrapper[4841]: E1203 17:00:38.238481 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.316008 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.316072 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.316098 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.316123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.316141 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.419348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.419408 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.419424 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.419449 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.419466 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.490366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" event={"ID":"46566bd5-34f0-4858-9a20-3a78c292e4ba","Type":"ContainerStarted","Data":"20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.507783 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.521364 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.521421 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.521435 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.521457 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.521473 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.524097 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.537489 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.552324 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.567942 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.585464 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.606280 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.616333 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.623919 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.623955 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.623967 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.623984 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.623995 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.628670 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.639156 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.650410 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 
17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.660164 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.672023 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.692821 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:38Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.727164 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.727207 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.727219 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.727237 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.727251 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.829368 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.829407 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.829420 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.829435 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.829450 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.932099 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.932142 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.932150 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.932165 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:38 crc kubenswrapper[4841]: I1203 17:00:38.932176 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:38Z","lastTransitionTime":"2025-12-03T17:00:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.035073 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.035110 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.035119 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.035131 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.035140 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.136605 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.136637 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.136649 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.136664 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.136675 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: E1203 17:00:39.149457 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.152950 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.152992 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.153003 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.153019 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.153030 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: E1203 17:00:39.175038 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.179601 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.179638 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.179650 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.179668 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.179682 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: E1203 17:00:39.192241 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.196232 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.196276 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.196294 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.196312 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.196322 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: E1203 17:00:39.209802 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.214356 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.214401 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.214414 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.214434 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.214446 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: E1203 17:00:39.227313 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: E1203 17:00:39.227484 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.229086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.229130 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.229149 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.229177 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.229206 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.238528 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.238569 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:39 crc kubenswrapper[4841]: E1203 17:00:39.238671 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:39 crc kubenswrapper[4841]: E1203 17:00:39.238858 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.331028 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.331068 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.331078 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.331095 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.331105 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.433556 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.433614 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.433631 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.433654 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.433674 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.497342 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/0.log" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.501726 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797" exitCode=1 Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.501781 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.502934 4841 scope.go:117] "RemoveContainer" containerID="120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.523708 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 
17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.536653 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.536711 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.536728 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.536753 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.536770 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.542184 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.566964 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.587654 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.607998 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 17:00:39.252875 6092 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 17:00:39.252996 6092 handler.go:190] Sending *v1.EgressIP 
event handler 8 for removal\\\\nI1203 17:00:39.253022 6092 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 17:00:39.253035 6092 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 17:00:39.253040 6092 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:39.253053 6092 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:39.253057 6092 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:39.253072 6092 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 17:00:39.253082 6092 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:39.253092 6092 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:39.253096 6092 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:39.253132 6092 factory.go:656] Stopping watch factory\\\\nI1203 17:00:39.253152 6092 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:39.253164 6092 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:39.253179 6092 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 
17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.636478 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.639190 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.639219 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.639230 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.639247 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.639260 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.664599 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.685043 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.707990 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.723980 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.734328 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.741972 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.742028 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.742043 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.742071 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.742089 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.747327 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.757467 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.766394 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/service
ca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:39Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.845213 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.845282 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.845295 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.845321 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.845335 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.948078 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.948131 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.948150 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.948176 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:39 crc kubenswrapper[4841]: I1203 17:00:39.948195 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:39Z","lastTransitionTime":"2025-12-03T17:00:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.050451 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.050495 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.050504 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.050519 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.050529 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.152623 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.152661 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.152683 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.152697 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.152706 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.238676 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:40 crc kubenswrapper[4841]: E1203 17:00:40.238890 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.255669 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.255782 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.255796 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.255820 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.255836 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.359248 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.359295 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.359306 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.359329 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.359339 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.462700 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.462766 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.462785 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.462810 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.462827 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.508062 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/0.log" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.516454 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.517060 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.539478 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.558015 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 
17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.565694 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.565744 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.565757 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.565776 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.565788 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.575018 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.587094 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.609312 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 17:00:39.252875 6092 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 17:00:39.252996 6092 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:39.253022 6092 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:39.253035 6092 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 17:00:39.253040 6092 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:39.253053 6092 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:39.253057 6092 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:39.253072 6092 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 17:00:39.253082 6092 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:39.253092 6092 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:39.253096 6092 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:39.253132 6092 factory.go:656] Stopping watch factory\\\\nI1203 17:00:39.253152 6092 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:39.253164 6092 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:39.253179 6092 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 
17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.623378 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.642008 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.662030 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.668608 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.668677 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.668699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.668728 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.668750 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.684170 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.701270 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.722177 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.738346 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.754651 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.768778 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:40Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.771494 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.771552 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.771567 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.771587 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.771601 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.874260 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.874323 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.874337 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.874354 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.874364 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.977979 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.978063 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.978088 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.978125 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:40 crc kubenswrapper[4841]: I1203 17:00:40.978147 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:40Z","lastTransitionTime":"2025-12-03T17:00:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.081404 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.081474 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.081499 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.081530 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.081556 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:41Z","lastTransitionTime":"2025-12-03T17:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.184436 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.184480 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.184491 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.184508 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.184518 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:41Z","lastTransitionTime":"2025-12-03T17:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.238575 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.238575 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:41 crc kubenswrapper[4841]: E1203 17:00:41.238703 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:41 crc kubenswrapper[4841]: E1203 17:00:41.238755 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.287380 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.287439 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.287458 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.287484 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.287502 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:41Z","lastTransitionTime":"2025-12-03T17:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.390699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.390758 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.390776 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.390799 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.390816 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:41Z","lastTransitionTime":"2025-12-03T17:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.494703 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.494781 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.494808 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.494838 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.494863 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:41Z","lastTransitionTime":"2025-12-03T17:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.522078 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/1.log" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.522953 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/0.log" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.525927 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706" exitCode=1 Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.525971 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.526026 4841 scope.go:117] "RemoveContainer" containerID="120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.527293 4841 scope.go:117] "RemoveContainer" containerID="31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706" Dec 03 17:00:41 crc kubenswrapper[4841]: E1203 17:00:41.527578 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.544784 4841 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\
\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.561669 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c4274
5f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.581331 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.597101 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.597177 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 
17:00:41.597201 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.597232 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.597255 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:41Z","lastTransitionTime":"2025-12-03T17:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.607284 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-ce
rt-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" 
certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.622268 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.645593 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.682616 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 17:00:39.252875 6092 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 17:00:39.252996 6092 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:39.253022 6092 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:39.253035 6092 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 17:00:39.253040 6092 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:39.253053 6092 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:39.253057 6092 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:39.253072 6092 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 17:00:39.253082 6092 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:39.253092 6092 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:39.253096 6092 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:39.253132 6092 factory.go:656] Stopping watch factory\\\\nI1203 17:00:39.253152 6092 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:39.253164 6092 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:39.253179 6092 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\" 6244 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.336714 6244 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.337114 6244 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 17:00:40.337147 6244 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:40.337155 6244 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 
17:00:40.337166 6244 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:40.337182 6244 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 17:00:40.337188 6244 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:40.337196 6244 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:40.337220 6244 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:40.337249 6244 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:40.337253 6244 factory.go:656] Stopping watch factory\\\\nI1203 17:00:40.337272 6244 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:40.337271 6244 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:40.337281 6244 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/
var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.699505 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 
17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.699550 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.699570 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.699593 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.699610 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:41Z","lastTransitionTime":"2025-12-03T17:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.705979 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.726613 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.746381 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.763202 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.776233 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.791824 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.802443 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.802506 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.802517 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.802539 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.802551 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:41Z","lastTransitionTime":"2025-12-03T17:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.806992 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.864203 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv"] Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.865032 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.868173 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.869627 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.894464 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\
\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 
UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.905029 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.905080 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.905099 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.905118 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.905130 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:41Z","lastTransitionTime":"2025-12-03T17:00:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.914504 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.928539 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.937550 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l9jp\" (UniqueName: \"kubernetes.io/projected/2d1d9aac-8558-4b49-a650-435b4fb09a09-kube-api-access-9l9jp\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.937606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d1d9aac-8558-4b49-a650-435b4fb09a09-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.937642 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d1d9aac-8558-4b49-a650-435b4fb09a09-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.937730 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d1d9aac-8558-4b49-a650-435b4fb09a09-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: 
\"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.945254 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.959033 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\
\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTi
me\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.981416 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://120c9f5767cc9438b66ca061567480c27d3387c0c0db26d0133dee0796aa3797\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:39Z\\\",\\\"message\\\":\\\"qos/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 17:00:39.252875 6092 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 17:00:39.252996 6092 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:39.253022 6092 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:39.253035 6092 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 17:00:39.253040 6092 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:39.253053 6092 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:39.253057 6092 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:39.253072 6092 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI1203 17:00:39.253082 6092 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:39.253092 6092 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:39.253096 6092 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:39.253132 6092 factory.go:656] Stopping watch factory\\\\nI1203 17:00:39.253152 6092 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:39.253164 6092 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:39.253179 6092 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 17\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:36Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\" 6244 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.336714 6244 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.337114 6244 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 17:00:40.337147 6244 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:40.337155 6244 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 
17:00:40.337166 6244 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:40.337182 6244 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 17:00:40.337188 6244 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:40.337196 6244 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:40.337220 6244 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:40.337249 6244 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:40.337253 6244 factory.go:656] Stopping watch factory\\\\nI1203 17:00:40.337272 6244 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:40.337271 6244 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:40.337281 6244 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/
var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:41 crc kubenswrapper[4841]: I1203 17:00:41.995123 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a643300
94483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
25-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:3
6Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:41Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.007762 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.007804 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.007816 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.007832 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.007843 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.009240 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d
38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.022897 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.039480 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d1d9aac-8558-4b49-a650-435b4fb09a09-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.039559 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d1d9aac-8558-4b49-a650-435b4fb09a09-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.039608 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l9jp\" (UniqueName: \"kubernetes.io/projected/2d1d9aac-8558-4b49-a650-435b4fb09a09-kube-api-access-9l9jp\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.039698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d1d9aac-8558-4b49-a650-435b4fb09a09-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.040270 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d1d9aac-8558-4b49-a650-435b4fb09a09-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.040513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d1d9aac-8558-4b49-a650-435b4fb09a09-env-overrides\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.042973 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.044892 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d1d9aac-8558-4b49-a650-435b4fb09a09-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.056073 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-9l9jp\" (UniqueName: \"kubernetes.io/projected/2d1d9aac-8558-4b49-a650-435b4fb09a09-kube-api-access-9l9jp\") pod \"ovnkube-control-plane-749d76644c-m2ccv\" (UID: \"2d1d9aac-8558-4b49-a650-435b4fb09a09\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.059289 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.073186 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.087155 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.099562 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.110489 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.110536 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.110551 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.110570 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.110586 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.111881 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.140139 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.140236 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.140271 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.140300 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140385 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140385 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:00:58.140348186 +0000 UTC m=+52.527868913 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140419 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140462 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140479 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140439 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140464 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:58.140448998 +0000 UTC m=+52.527969725 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140530 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.140575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140600 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140662 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:58.140643423 +0000 UTC m=+52.528164360 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140664 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140687 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:58.140676454 +0000 UTC m=+52.528197421 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.140723 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:58.140707944 +0000 UTC m=+52.528228671 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.187780 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" Dec 03 17:00:42 crc kubenswrapper[4841]: W1203 17:00:42.201822 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d1d9aac_8558_4b49_a650_435b4fb09a09.slice/crio-df6f54de7513af09669f2e6e2ed09bbc1e7012ab0a4f89ab49054801cbdb7f62 WatchSource:0}: Error finding container df6f54de7513af09669f2e6e2ed09bbc1e7012ab0a4f89ab49054801cbdb7f62: Status 404 returned error can't find the container with id df6f54de7513af09669f2e6e2ed09bbc1e7012ab0a4f89ab49054801cbdb7f62 Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.213455 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.213494 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.213504 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.213521 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.213533 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.238846 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.238995 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.316132 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.316187 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.316205 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.316228 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.316245 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.419073 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.419147 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.419168 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.419196 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.419215 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.522531 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.522573 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.522583 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.522597 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.522608 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.531453 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" event={"ID":"2d1d9aac-8558-4b49-a650-435b4fb09a09","Type":"ContainerStarted","Data":"df6f54de7513af09669f2e6e2ed09bbc1e7012ab0a4f89ab49054801cbdb7f62"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.534414 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/1.log" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.540217 4841 scope.go:117] "RemoveContainer" containerID="31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706" Dec 03 17:00:42 crc kubenswrapper[4841]: E1203 17:00:42.540480 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.557412 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.578853 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.594004 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.620723 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.626102 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.626136 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.626146 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.626162 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.626173 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.642558 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.671106 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.711467 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.730167 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.730248 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.730273 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.730306 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.730328 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.732789 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.749475 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\
\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.768576 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de259712
6bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"n
ame\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC 
(now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.788459 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 
17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.804877 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.821375 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.833496 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.833540 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.833557 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.833579 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.833595 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.840271 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z 
is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.864807 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\" 6244 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.336714 6244 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.337114 6244 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:40.337147 6244 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:40.337155 6244 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:40.337166 6244 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:40.337182 6244 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 17:00:40.337188 6244 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:40.337196 6244 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:40.337220 6244 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:40.337249 6244 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:40.337253 6244 factory.go:656] Stopping watch factory\\\\nI1203 17:00:40.337272 6244 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:40.337271 6244 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:40.337281 6244 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:42Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.935891 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.936012 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.936042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.936076 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:42 crc kubenswrapper[4841]: I1203 17:00:42.936103 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:42Z","lastTransitionTime":"2025-12-03T17:00:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.038487 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.038530 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.038543 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.038565 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.038581 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.140818 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.140845 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.140855 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.140868 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.140877 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.238146 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.238202 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:43 crc kubenswrapper[4841]: E1203 17:00:43.238272 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:43 crc kubenswrapper[4841]: E1203 17:00:43.238383 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.243519 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.243555 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.243566 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.243609 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.243635 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.346052 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.346091 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.346103 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.346120 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.346136 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.447953 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.447989 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.448000 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.448016 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.448026 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.544162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" event={"ID":"2d1d9aac-8558-4b49-a650-435b4fb09a09","Type":"ContainerStarted","Data":"ad9a7802f72d2f590f7320d0151dfc4dc3273979d79c88f7a020a57078ce588b"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.544761 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" event={"ID":"2d1d9aac-8558-4b49-a650-435b4fb09a09","Type":"ContainerStarted","Data":"0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.550321 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.550386 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.550409 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.550438 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.550455 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.559600 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.573574 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.589495 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC 
(now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.603582 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 
17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.614498 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.631698 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.647196 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.652396 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.652445 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.652463 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.652487 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.652503 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.667795 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\" 6244 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.336714 6244 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.337114 6244 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:40.337147 6244 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:40.337155 6244 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:40.337166 6244 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:40.337182 6244 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 17:00:40.337188 6244 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:40.337196 6244 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:40.337220 6244 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:40.337249 6244 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:40.337253 6244 factory.go:656] Stopping watch factory\\\\nI1203 17:00:40.337272 6244 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:40.337271 6244 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:40.337281 6244 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.683778 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35
ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.696262 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.715722 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.720167 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-fcw2m"] Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.721145 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:43 crc kubenswrapper[4841]: E1203 17:00:43.721253 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.727042 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.738175 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.750452 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.754896 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.755000 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.755022 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.755044 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.755059 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.761845 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c79
7af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.775159 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.785988 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.797509 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 
17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.812593 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.827566 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.848333 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\" 6244 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.336714 6244 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.337114 6244 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:40.337147 6244 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:40.337155 6244 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:40.337166 6244 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:40.337182 6244 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 17:00:40.337188 6244 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:40.337196 6244 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:40.337220 6244 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:40.337249 6244 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:40.337253 6244 factory.go:656] Stopping watch factory\\\\nI1203 17:00:40.337272 6244 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:40.337271 6244 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:40.337281 6244 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.857839 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.857872 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.857880 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.857893 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.857917 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.858327 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.858379 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5cpx\" (UniqueName: \"kubernetes.io/projected/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-kube-api-access-t5cpx\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.865574 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.881538 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.898412 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.916327 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.928980 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.949140 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.959197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.959282 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5cpx\" (UniqueName: \"kubernetes.io/projected/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-kube-api-access-t5cpx\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:43 crc kubenswrapper[4841]: E1203 17:00:43.959445 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:43 crc 
kubenswrapper[4841]: E1203 17:00:43.959542 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs podName:fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:44.459509017 +0000 UTC m=+38.847029784 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs") pod "network-metrics-daemon-fcw2m" (UID: "fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.961219 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.961265 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.961282 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.961306 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.961324 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:43Z","lastTransitionTime":"2025-12-03T17:00:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.969196 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d
38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.984567 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.987999 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5cpx\" (UniqueName: \"kubernetes.io/projected/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-kube-api-access-t5cpx\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:43 crc kubenswrapper[4841]: I1203 17:00:43.998640 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:43Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.018421 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:44Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 
17:00:44.064452 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.064804 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.064950 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.065079 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.065169 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.167434 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.167462 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.167472 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.167485 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.167493 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.238343 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:44 crc kubenswrapper[4841]: E1203 17:00:44.238477 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.270670 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.270754 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.270780 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.270813 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.270838 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.373567 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.373644 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.373668 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.373700 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.373723 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.463834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:44 crc kubenswrapper[4841]: E1203 17:00:44.464151 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:44 crc kubenswrapper[4841]: E1203 17:00:44.464265 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs podName:fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:45.464232489 +0000 UTC m=+39.851753256 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs") pod "network-metrics-daemon-fcw2m" (UID: "fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.476163 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.476202 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.476214 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.476235 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.476251 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.581406 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.581468 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.581494 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.581524 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.581536 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.684190 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.684257 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.684271 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.684295 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.684308 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.788698 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.788778 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.788798 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.788827 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.788850 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.891636 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.891684 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.891694 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.891707 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.891716 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.996311 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.996399 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.996418 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.996444 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:44 crc kubenswrapper[4841]: I1203 17:00:44.996462 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:44Z","lastTransitionTime":"2025-12-03T17:00:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.099493 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.099543 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.099554 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.099574 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.099587 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:45Z","lastTransitionTime":"2025-12-03T17:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.202250 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.202312 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.202322 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.202343 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.202354 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:45Z","lastTransitionTime":"2025-12-03T17:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.237856 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.237964 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:45 crc kubenswrapper[4841]: E1203 17:00:45.238092 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.237978 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:45 crc kubenswrapper[4841]: E1203 17:00:45.238192 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:00:45 crc kubenswrapper[4841]: E1203 17:00:45.238320 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.305117 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.305149 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.305159 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.305173 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.305182 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:45Z","lastTransitionTime":"2025-12-03T17:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.408978 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.409051 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.409074 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.409104 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.409126 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:45Z","lastTransitionTime":"2025-12-03T17:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.475600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:45 crc kubenswrapper[4841]: E1203 17:00:45.475766 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:45 crc kubenswrapper[4841]: E1203 17:00:45.475858 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs podName:fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:47.475819955 +0000 UTC m=+41.863340712 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs") pod "network-metrics-daemon-fcw2m" (UID: "fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.512325 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.512385 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.512409 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.512438 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.512455 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:45Z","lastTransitionTime":"2025-12-03T17:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.614871 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.614984 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.615009 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.615042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.615067 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:45Z","lastTransitionTime":"2025-12-03T17:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.718209 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.718324 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.718340 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.718359 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.718372 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:45Z","lastTransitionTime":"2025-12-03T17:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.821527 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.821581 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.821594 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.821614 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.821629 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:45Z","lastTransitionTime":"2025-12-03T17:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.924723 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.924774 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.924790 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.924810 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:45 crc kubenswrapper[4841]: I1203 17:00:45.924825 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:45Z","lastTransitionTime":"2025-12-03T17:00:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.028124 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.028198 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.028216 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.028235 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.028249 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.130890 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.130992 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.131016 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.131039 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.131056 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.233651 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.233707 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.233724 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.233753 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.233769 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.238338 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:46 crc kubenswrapper[4841]: E1203 17:00:46.238556 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.251661 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.268890 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\
\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\
":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.297563 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\" 6244 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.336714 6244 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.337114 6244 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:40.337147 6244 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:40.337155 6244 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:40.337166 6244 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:40.337182 6244 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 17:00:40.337188 6244 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:40.337196 6244 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:40.337220 6244 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:40.337249 6244 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:40.337253 6244 factory.go:656] Stopping watch factory\\\\nI1203 17:00:40.337272 6244 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:40.337271 6244 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:40.337281 6244 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.313257 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.327717 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.335706 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.335814 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.335835 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.335893 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.335963 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.345505 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.360436 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.371780 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.384245 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.395012 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.405793 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.416485 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc 
kubenswrapper[4841]: I1203 17:00:46.429309 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.438475 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.438501 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.438508 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.438521 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.438529 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.448325 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.466710 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.482165 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 
17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:46Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.542340 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.542630 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.542843 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.543097 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.543277 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.645974 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.646783 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.646979 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.647128 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.647270 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.749854 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.750419 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.750573 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.750717 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.750860 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.854401 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.854944 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.855154 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.855340 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.855520 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.958199 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.958238 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.958248 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.958263 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:46 crc kubenswrapper[4841]: I1203 17:00:46.958272 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:46Z","lastTransitionTime":"2025-12-03T17:00:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.061518 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.061856 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.062076 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.062287 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.062456 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.166515 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.166594 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.166622 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.166653 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.166674 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.238510 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.238618 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.238548 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:47 crc kubenswrapper[4841]: E1203 17:00:47.238752 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:00:47 crc kubenswrapper[4841]: E1203 17:00:47.238841 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:47 crc kubenswrapper[4841]: E1203 17:00:47.239053 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.270083 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.270126 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.270136 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.270151 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.270173 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.372881 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.372958 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.372971 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.372990 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.373004 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.477215 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.477258 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.477271 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.477286 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.477298 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.497108 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:47 crc kubenswrapper[4841]: E1203 17:00:47.497279 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:47 crc kubenswrapper[4841]: E1203 17:00:47.497412 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs podName:fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:51.497367647 +0000 UTC m=+45.884888404 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs") pod "network-metrics-daemon-fcw2m" (UID: "fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.579991 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.580037 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.580048 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.580066 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.580079 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.682666 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.682729 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.682743 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.682759 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.682770 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.787774 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.787814 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.787858 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.787877 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.787889 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.890437 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.890502 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.890519 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.890544 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.890564 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.993622 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.993688 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.993709 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.993734 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:47 crc kubenswrapper[4841]: I1203 17:00:47.993752 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:47Z","lastTransitionTime":"2025-12-03T17:00:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.096405 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.096473 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.096489 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.096506 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.096519 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:48Z","lastTransitionTime":"2025-12-03T17:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.200125 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.200182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.200200 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.200223 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.200241 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:48Z","lastTransitionTime":"2025-12-03T17:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.238828 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:48 crc kubenswrapper[4841]: E1203 17:00:48.239136 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.303157 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.303207 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.303225 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.303246 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.303260 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:48Z","lastTransitionTime":"2025-12-03T17:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.406173 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.406244 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.406264 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.406290 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.406308 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:48Z","lastTransitionTime":"2025-12-03T17:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.508275 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.508362 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.508378 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.508420 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.508434 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:48Z","lastTransitionTime":"2025-12-03T17:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.611208 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.611258 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.611274 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.611319 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.611339 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:48Z","lastTransitionTime":"2025-12-03T17:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.713932 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.713977 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.713989 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.714008 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.714020 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:48Z","lastTransitionTime":"2025-12-03T17:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.816825 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.816873 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.816892 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.816973 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.816988 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:48Z","lastTransitionTime":"2025-12-03T17:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.919983 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.920024 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.920034 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.920050 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:48 crc kubenswrapper[4841]: I1203 17:00:48.920061 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:48Z","lastTransitionTime":"2025-12-03T17:00:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.022878 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.022983 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.023008 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.023040 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.023065 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.125748 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.125822 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.125849 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.125880 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.125946 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.228930 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.228969 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.228977 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.228992 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.229003 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.238566 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.238613 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.238599 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:49 crc kubenswrapper[4841]: E1203 17:00:49.238810 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:00:49 crc kubenswrapper[4841]: E1203 17:00:49.238956 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:49 crc kubenswrapper[4841]: E1203 17:00:49.239090 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.332302 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.332387 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.332406 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.332431 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.332449 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.435028 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.435098 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.435117 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.435146 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.435166 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.537808 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.538347 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.538419 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.538461 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.538485 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.626897 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.627003 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.627023 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.627051 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.627075 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: E1203 17:00:49.646302 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:49Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.651762 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.651824 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.651848 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.651878 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.651897 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: E1203 17:00:49.670413 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:49Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.675895 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.675954 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.675968 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.675984 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.675996 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: E1203 17:00:49.692453 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:49Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.697125 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.697185 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.697197 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.697254 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.697273 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: E1203 17:00:49.713174 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:49Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.717833 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.717871 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.717881 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.717895 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.717935 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: E1203 17:00:49.733679 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:49Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:49 crc kubenswrapper[4841]: E1203 17:00:49.733799 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.735716 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.735748 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.735756 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.735773 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.735784 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.838811 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.839120 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.839289 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.839465 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.839606 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.942040 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.942441 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.942601 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.942757 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:49 crc kubenswrapper[4841]: I1203 17:00:49.942933 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:49Z","lastTransitionTime":"2025-12-03T17:00:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.046734 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.046820 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.046839 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.047392 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.047468 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.149709 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.149741 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.149750 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.149763 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.149772 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.238120 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:50 crc kubenswrapper[4841]: E1203 17:00:50.238285 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.251657 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.251732 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.251740 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.251754 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.251764 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.354074 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.354112 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.354121 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.354137 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.354148 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.456943 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.456993 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.457008 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.457032 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.457049 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.559895 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.559960 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.559971 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.559988 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.559999 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.662161 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.662201 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.662214 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.662232 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.662245 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.764985 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.765058 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.765088 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.765115 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.765135 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.868379 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.868433 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.868446 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.868464 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.868476 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.971568 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.971632 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.971641 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.971658 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:50 crc kubenswrapper[4841]: I1203 17:00:50.971668 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:50Z","lastTransitionTime":"2025-12-03T17:00:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.075141 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.075215 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.075239 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.075288 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.075312 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:51Z","lastTransitionTime":"2025-12-03T17:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.178205 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.178255 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.178267 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.178284 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.178296 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:51Z","lastTransitionTime":"2025-12-03T17:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.238480 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.238578 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.238500 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:51 crc kubenswrapper[4841]: E1203 17:00:51.238737 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:00:51 crc kubenswrapper[4841]: E1203 17:00:51.238851 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:51 crc kubenswrapper[4841]: E1203 17:00:51.239000 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.280458 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.280498 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.280507 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.280522 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.280530 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:51Z","lastTransitionTime":"2025-12-03T17:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.383400 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.383480 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.383498 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.383523 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.383540 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:51Z","lastTransitionTime":"2025-12-03T17:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.486757 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.486795 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.486806 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.486850 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.486862 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:51Z","lastTransitionTime":"2025-12-03T17:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.542630 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:51 crc kubenswrapper[4841]: E1203 17:00:51.542811 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:51 crc kubenswrapper[4841]: E1203 17:00:51.542882 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs podName:fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99 nodeName:}" failed. No retries permitted until 2025-12-03 17:00:59.542864179 +0000 UTC m=+53.930384906 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs") pod "network-metrics-daemon-fcw2m" (UID: "fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.589582 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.589644 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.589664 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.589689 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.589710 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:51Z","lastTransitionTime":"2025-12-03T17:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.692671 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.692761 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.692779 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.692804 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.692823 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:51Z","lastTransitionTime":"2025-12-03T17:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.795258 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.795321 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.795338 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.795360 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.795374 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:51Z","lastTransitionTime":"2025-12-03T17:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.898202 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.898275 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.898299 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.898329 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:51 crc kubenswrapper[4841]: I1203 17:00:51.898352 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:51Z","lastTransitionTime":"2025-12-03T17:00:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.000218 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.000252 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.000259 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.000272 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.000281 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.103018 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.103052 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.103061 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.103076 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.103085 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.205867 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.205981 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.206003 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.206030 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.206051 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.238374 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:52 crc kubenswrapper[4841]: E1203 17:00:52.238582 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.310448 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.310538 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.310593 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.310618 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.310668 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.414344 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.414419 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.414445 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.414475 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.414498 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.517073 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.517101 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.517109 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.517122 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.517130 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.618731 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.618789 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.618807 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.618829 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.618848 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.722123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.722195 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.722219 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.722248 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.722269 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.825097 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.825174 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.825192 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.825219 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.825237 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.927356 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.927389 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.927398 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.927410 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:52 crc kubenswrapper[4841]: I1203 17:00:52.927419 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:52Z","lastTransitionTime":"2025-12-03T17:00:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.030039 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.030124 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.030151 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.030184 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.030208 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.133814 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.133873 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.133890 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.133961 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.133984 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.236834 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.236874 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.236883 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.236900 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.236930 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.238072 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.238147 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.238090 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:53 crc kubenswrapper[4841]: E1203 17:00:53.238193 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:53 crc kubenswrapper[4841]: E1203 17:00:53.238238 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:00:53 crc kubenswrapper[4841]: E1203 17:00:53.238332 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.338985 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.339056 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.339076 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.339111 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.339133 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.441486 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.441601 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.441611 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.441625 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.441636 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.544709 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.544751 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.544760 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.544776 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.544790 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.647511 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.647553 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.647562 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.647579 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.647590 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.751028 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.751097 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.751115 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.751136 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.751153 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.854569 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.854633 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.854646 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.854668 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.854684 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.961574 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.961681 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.961711 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.961750 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:53 crc kubenswrapper[4841]: I1203 17:00:53.961781 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:53Z","lastTransitionTime":"2025-12-03T17:00:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.065714 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.065799 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.065824 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.065855 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.065878 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.169018 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.169085 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.169103 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.169126 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.169145 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.238694 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:54 crc kubenswrapper[4841]: E1203 17:00:54.238837 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.271518 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.271630 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.271660 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.271693 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.271716 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.374764 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.374866 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.374891 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.374942 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.374960 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.477900 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.478015 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.478055 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.478119 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.478159 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.580720 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.580775 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.580794 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.580820 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.580838 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.683803 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.683864 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.683881 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.683943 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.683962 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.786875 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.786958 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.786978 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.787000 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.787017 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.889717 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.889782 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.889792 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.889809 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.889821 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.992483 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.992520 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.992529 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.992545 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:54 crc kubenswrapper[4841]: I1203 17:00:54.992553 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:54Z","lastTransitionTime":"2025-12-03T17:00:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.095551 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.095626 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.095650 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.095682 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.095708 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:55Z","lastTransitionTime":"2025-12-03T17:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.198265 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.198304 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.198315 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.198327 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.198338 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:55Z","lastTransitionTime":"2025-12-03T17:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.238048 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.238048 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.238055 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:55 crc kubenswrapper[4841]: E1203 17:00:55.238168 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:55 crc kubenswrapper[4841]: E1203 17:00:55.238717 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:55 crc kubenswrapper[4841]: E1203 17:00:55.238797 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.239414 4841 scope.go:117] "RemoveContainer" containerID="31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.301271 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.301307 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.301317 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.301330 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.301340 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:55Z","lastTransitionTime":"2025-12-03T17:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.405017 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.405085 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.405104 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.405129 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.405170 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:55Z","lastTransitionTime":"2025-12-03T17:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.507959 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.508010 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.508026 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.508048 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.508065 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:55Z","lastTransitionTime":"2025-12-03T17:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.588679 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/1.log" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.591184 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.591695 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.610825 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.610860 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.610868 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.610881 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.610890 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:55Z","lastTransitionTime":"2025-12-03T17:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.618941 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.633915 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.660721 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\" 6244 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.336714 6244 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.337114 6244 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:40.337147 6244 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:40.337155 6244 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:40.337166 6244 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:40.337182 6244 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 17:00:40.337188 6244 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:40.337196 6244 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:40.337220 6244 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:40.337249 6244 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:40.337253 6244 factory.go:656] Stopping watch factory\\\\nI1203 17:00:40.337272 6244 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:40.337271 6244 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:40.337281 6244 handler.go:208] Removed *v1.Namespace event handler 
5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.676260 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.702413 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.713350 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.713392 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.713401 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.713414 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.713424 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:55Z","lastTransitionTime":"2025-12-03T17:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.718229 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.731517 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.743598 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.755274 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.768407 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.779273 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.788731 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.798726 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc 
kubenswrapper[4841]: I1203 17:00:55.813209 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124a
daa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.815168 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.815219 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.815231 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.815250 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 
17:00:55.815261 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:55Z","lastTransitionTime":"2025-12-03T17:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.828447 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.839666 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:55Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.917390 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.917431 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.917441 4841 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.917457 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:55 crc kubenswrapper[4841]: I1203 17:00:55.917468 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:55Z","lastTransitionTime":"2025-12-03T17:00:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.019871 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.019940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.019958 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.019980 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.019996 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.122228 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.122287 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.122305 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.122320 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.122331 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.225484 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.225519 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.225532 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.225548 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.225560 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.237785 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:56 crc kubenswrapper[4841]: E1203 17:00:56.238624 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.258453 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.278137 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.290148 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.300553 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.314318 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.332698 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.333317 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.333367 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.333384 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.333405 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.333424 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.349322 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.359667 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\
\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.371397 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.381707 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.390510 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.403695 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\
\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc47827
4c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC 
(now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca3
2a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.416387 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.435359 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.435405 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.435418 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.435434 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.435445 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.437122 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\" 6244 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.336714 6244 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.337114 6244 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:40.337147 6244 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:40.337155 6244 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:40.337166 6244 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:40.337182 6244 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 17:00:40.337188 6244 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:40.337196 6244 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:40.337220 6244 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:40.337249 6244 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:40.337253 6244 factory.go:656] Stopping watch factory\\\\nI1203 17:00:40.337272 6244 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:40.337271 6244 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:40.337281 6244 handler.go:208] Removed *v1.Namespace event handler 
5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"
name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.447083 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.459983 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.538263 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.538320 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.538330 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.538345 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.538388 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.596467 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/2.log" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.597559 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/1.log" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.601400 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9" exitCode=1 Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.601457 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.601551 4841 scope.go:117] "RemoveContainer" containerID="31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.603020 4841 scope.go:117] "RemoveContainer" containerID="76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9" Dec 03 17:00:56 crc kubenswrapper[4841]: E1203 17:00:56.603230 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.616628 4841 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.631440 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.640625 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.640686 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.640704 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.640729 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.640745 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.650451 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://31320a697563ba4421aafbf76486a7bb50a25dd34aabe6db7e2bec5e286b8706\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"message\\\":\\\" 6244 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.336714 6244 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI1203 17:00:40.337114 6244 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for 
removal\\\\nI1203 17:00:40.337147 6244 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 17:00:40.337155 6244 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 17:00:40.337166 6244 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 17:00:40.337182 6244 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 17:00:40.337188 6244 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 17:00:40.337196 6244 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 17:00:40.337220 6244 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 17:00:40.337249 6244 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 17:00:40.337253 6244 factory.go:656] Stopping watch factory\\\\nI1203 17:00:40.337272 6244 ovnkube.go:599] Stopped ovnkube\\\\nI1203 17:00:40.337271 6244 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 17:00:40.337281 6244 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI12\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, 
AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38
f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.663274 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.674247 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.681997 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.694062 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.706955 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.718097 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.729506 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.741159 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.742757 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.742803 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.742814 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc 
kubenswrapper[4841]: I1203 17:00:56.742831 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.742842 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.750965 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.759876 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc 
kubenswrapper[4841]: I1203 17:00:56.773495 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124a
daa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.786530 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.799164 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:56Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.844974 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.845021 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.845032 4841 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.845048 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.845060 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.947324 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.947386 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.947403 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.947428 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:56 crc kubenswrapper[4841]: I1203 17:00:56.947447 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:56Z","lastTransitionTime":"2025-12-03T17:00:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.049583 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.049635 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.049648 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.049665 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.049674 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.153072 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.153150 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.153183 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.153214 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.153239 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.238476 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.238516 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.238640 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:57 crc kubenswrapper[4841]: E1203 17:00:57.238762 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:00:57 crc kubenswrapper[4841]: E1203 17:00:57.238934 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:57 crc kubenswrapper[4841]: E1203 17:00:57.239112 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.256102 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.256199 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.256226 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.256251 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.256268 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.358993 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.359044 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.359061 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.359086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.359103 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.461348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.461400 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.461414 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.461437 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.461453 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.563985 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.564069 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.564093 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.564126 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.564149 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.608045 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/2.log" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.611531 4841 scope.go:117] "RemoveContainer" containerID="76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9" Dec 03 17:00:57 crc kubenswrapper[4841]: E1203 17:00:57.611692 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.626413 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.639061 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.651758 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc 
kubenswrapper[4841]: I1203 17:00:57.666575 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.666612 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.666620 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.666634 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.666644 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.666988 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 
17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.683553 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.697487 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.713773 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.732471 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.762880 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.768940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.768988 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.768996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.769012 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.769024 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.776531 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.793840 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35
ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.811069 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.825937 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.840225 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.852031 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.862892 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:00:57Z is after 2025-08-24T17:21:41Z" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.871847 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.871886 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.871901 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.871941 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.871954 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.974567 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.974614 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.974625 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.974641 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:57 crc kubenswrapper[4841]: I1203 17:00:57.974653 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:57Z","lastTransitionTime":"2025-12-03T17:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.077464 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.077503 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.077515 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.077533 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.077546 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:58Z","lastTransitionTime":"2025-12-03T17:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.180191 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.180234 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.180246 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.180264 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.180276 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:58Z","lastTransitionTime":"2025-12-03T17:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.219195 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.219306 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.219355 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219400 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:01:30.219373799 +0000 UTC m=+84.606894516 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.219443 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219487 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219502 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219529 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219548 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:01:30.219533253 +0000 UTC m=+84.607053990 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219548 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219557 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219586 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:01:30.219580154 +0000 UTC m=+84.607100871 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.219490 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219676 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 17:01:30.219655556 +0000 UTC m=+84.607176323 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219790 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219811 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219823 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.219938 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 17:01:30.219864171 +0000 UTC m=+84.607384998 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.238363 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:00:58 crc kubenswrapper[4841]: E1203 17:00:58.238535 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.282236 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.282278 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.282289 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.282307 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.282319 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:58Z","lastTransitionTime":"2025-12-03T17:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.384615 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.385018 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.385260 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.385497 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.385711 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:58Z","lastTransitionTime":"2025-12-03T17:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.489051 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.489286 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.489354 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.489447 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.489518 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:58Z","lastTransitionTime":"2025-12-03T17:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.592634 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.592691 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.592708 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.592732 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.592747 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:58Z","lastTransitionTime":"2025-12-03T17:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.695032 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.695089 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.695106 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.695128 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.695146 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:58Z","lastTransitionTime":"2025-12-03T17:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.797631 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.797672 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.797682 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.797696 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.797705 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:58Z","lastTransitionTime":"2025-12-03T17:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.901102 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.901169 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.901192 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.901237 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:58 crc kubenswrapper[4841]: I1203 17:00:58.901455 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:58Z","lastTransitionTime":"2025-12-03T17:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.003331 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.003368 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.003378 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.003392 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.003400 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.107446 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.107514 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.107535 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.107564 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.107585 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.210379 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.210448 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.210460 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.210476 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.210488 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.238628 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.238640 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:59 crc kubenswrapper[4841]: E1203 17:00:59.238797 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.238641 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:00:59 crc kubenswrapper[4841]: E1203 17:00:59.238928 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:00:59 crc kubenswrapper[4841]: E1203 17:00:59.239003 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.313559 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.313615 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.313631 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.313652 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.313667 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.415771 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.415849 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.415865 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.415942 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.415961 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.519406 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.519506 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.519531 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.519563 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.519587 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.622218 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.622297 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.622317 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.622342 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.622365 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.634659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:00:59 crc kubenswrapper[4841]: E1203 17:00:59.634854 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:59 crc kubenswrapper[4841]: E1203 17:00:59.635006 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs podName:fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99 nodeName:}" failed. No retries permitted until 2025-12-03 17:01:15.634973849 +0000 UTC m=+70.022494616 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs") pod "network-metrics-daemon-fcw2m" (UID: "fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.725164 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.725227 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.725238 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.725257 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.725269 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.829379 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.829432 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.829450 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.829475 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.829493 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.932287 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.932354 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.932372 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.932399 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:00:59 crc kubenswrapper[4841]: I1203 17:00:59.932416 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:00:59Z","lastTransitionTime":"2025-12-03T17:00:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.027584 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.027629 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.027642 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.027685 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.027700 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: E1203 17:01:00.042732 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.047888 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.047972 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.047990 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.048016 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.048033 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: E1203 17:01:00.068246 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.073144 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.073198 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.073208 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.073225 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.073241 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: E1203 17:01:00.090666 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.094989 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.095021 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.095029 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.095044 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.095055 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: E1203 17:01:00.112750 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.116938 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.116977 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.116989 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.117008 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.117021 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: E1203 17:01:00.134806 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: E1203 17:01:00.134947 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.136263 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.136316 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.136329 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.136348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.136361 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.238444 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:00 crc kubenswrapper[4841]: E1203 17:01:00.238643 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.239845 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.239925 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.239945 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.239973 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.239990 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.343453 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.343507 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.343524 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.343547 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.343563 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.374988 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.386415 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.399099 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI 
cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.416111 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.432294 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.445623 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.445694 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.445713 4841 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.445740 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.445760 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.448150 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb27670
3f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.464644 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.486931 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.502381 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.520305 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.536448 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.547863 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.547934 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.547945 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.547961 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.547973 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.552359 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.566736 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.581217 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.594229 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.609218 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.623228 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.635988 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:00Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:00 crc 
kubenswrapper[4841]: I1203 17:01:00.650062 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.650147 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.650158 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.650182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.650196 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.752971 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.753038 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.753058 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.753118 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.753138 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.856042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.856102 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.856119 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.856141 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.856156 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.958399 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.958448 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.958459 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.958477 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:00 crc kubenswrapper[4841]: I1203 17:01:00.958487 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:00Z","lastTransitionTime":"2025-12-03T17:01:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.060924 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.061033 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.061048 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.061064 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.061076 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.165174 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.165270 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.165297 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.165332 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.165355 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.248417 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.248475 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.248441 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:01 crc kubenswrapper[4841]: E1203 17:01:01.248677 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:01 crc kubenswrapper[4841]: E1203 17:01:01.248802 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:01 crc kubenswrapper[4841]: E1203 17:01:01.248999 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.268682 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.268757 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.268779 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.268804 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.268823 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.371866 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.371946 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.371966 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.371990 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.372009 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.474854 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.474932 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.474952 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.474975 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.474993 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.577978 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.578044 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.578055 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.578074 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.578085 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.680551 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.680601 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.680614 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.680632 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.680647 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.787194 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.787256 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.787276 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.787318 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.787339 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.890558 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.890655 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.890674 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.890699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.890737 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.993712 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.993787 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.993806 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.993830 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:01 crc kubenswrapper[4841]: I1203 17:01:01.993847 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:01Z","lastTransitionTime":"2025-12-03T17:01:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.096668 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.096737 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.096759 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.096789 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.096808 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:02Z","lastTransitionTime":"2025-12-03T17:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.200160 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.200233 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.200255 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.200284 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.200304 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:02Z","lastTransitionTime":"2025-12-03T17:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.237899 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:02 crc kubenswrapper[4841]: E1203 17:01:02.238171 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.302636 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.302689 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.302712 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.302741 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.302762 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:02Z","lastTransitionTime":"2025-12-03T17:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.406387 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.406785 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.407009 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.407225 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.407386 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:02Z","lastTransitionTime":"2025-12-03T17:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.510198 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.510496 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.510638 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.510776 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.510941 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:02Z","lastTransitionTime":"2025-12-03T17:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.614932 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.614997 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.615015 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.615037 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.615051 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:02Z","lastTransitionTime":"2025-12-03T17:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.717972 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.718034 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.718051 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.718075 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.718091 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:02Z","lastTransitionTime":"2025-12-03T17:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.820956 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.821012 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.821023 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.821042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.821053 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:02Z","lastTransitionTime":"2025-12-03T17:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.923734 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.924178 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.924339 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.924491 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:02 crc kubenswrapper[4841]: I1203 17:01:02.924674 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:02Z","lastTransitionTime":"2025-12-03T17:01:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.027806 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.027875 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.027887 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.027934 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.027949 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.130637 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.130698 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.130713 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.130737 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.130751 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.233812 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.234008 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.234028 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.234054 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.234072 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.238319 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:03 crc kubenswrapper[4841]: E1203 17:01:03.238479 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.238742 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.238756 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:03 crc kubenswrapper[4841]: E1203 17:01:03.239061 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:03 crc kubenswrapper[4841]: E1203 17:01:03.239226 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.337321 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.337358 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.337371 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.337387 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.337398 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.439726 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.439759 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.439770 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.439790 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.439803 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.542941 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.542997 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.543010 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.543028 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.543041 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.645573 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.645618 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.645629 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.645645 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.645656 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.748323 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.748362 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.748370 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.748383 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.748392 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.850078 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.850115 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.850123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.850137 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.850146 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.955218 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.955254 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.955262 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.955275 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:03 crc kubenswrapper[4841]: I1203 17:01:03.955284 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:03Z","lastTransitionTime":"2025-12-03T17:01:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.057993 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.058053 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.058079 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.058111 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.058133 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.160956 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.162047 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.162090 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.162118 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.162133 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.238255 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:04 crc kubenswrapper[4841]: E1203 17:01:04.238485 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.264125 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.265222 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.265429 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.265843 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.266377 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.370167 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.370227 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.370240 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.370263 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.370277 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.473455 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.473766 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.473967 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.474115 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.474232 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.577653 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.577711 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.577735 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.577801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.577825 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.680502 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.680560 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.680574 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.680592 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.680606 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.783896 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.783986 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.783996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.784015 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.784027 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.887328 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.887363 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.887372 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.887388 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.887399 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.989736 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.989765 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.989774 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.989798 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:04 crc kubenswrapper[4841]: I1203 17:01:04.989808 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:04Z","lastTransitionTime":"2025-12-03T17:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.092442 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.092487 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.092496 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.092514 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.092524 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:05Z","lastTransitionTime":"2025-12-03T17:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.196888 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.196999 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.197023 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.197055 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.197078 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:05Z","lastTransitionTime":"2025-12-03T17:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.238810 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.238876 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.238835 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:05 crc kubenswrapper[4841]: E1203 17:01:05.239013 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:05 crc kubenswrapper[4841]: E1203 17:01:05.239223 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:05 crc kubenswrapper[4841]: E1203 17:01:05.239349 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.300084 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.300128 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.300140 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.300159 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.300172 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:05Z","lastTransitionTime":"2025-12-03T17:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.402558 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.402601 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.402612 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.402629 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.402640 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:05Z","lastTransitionTime":"2025-12-03T17:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.504837 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.504940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.504960 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.504985 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.505006 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:05Z","lastTransitionTime":"2025-12-03T17:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.607666 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.607708 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.607722 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.607739 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.607751 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:05Z","lastTransitionTime":"2025-12-03T17:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.710630 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.710672 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.710682 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.710697 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.710709 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:05Z","lastTransitionTime":"2025-12-03T17:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.813760 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.813824 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.813838 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.813856 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.813867 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:05Z","lastTransitionTime":"2025-12-03T17:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.916547 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.916611 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.916625 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.916647 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:05 crc kubenswrapper[4841]: I1203 17:01:05.916663 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:05Z","lastTransitionTime":"2025-12-03T17:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.020114 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.020170 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.020195 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.020215 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.020227 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.122960 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.123009 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.123020 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.123036 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.123046 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.225767 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.226286 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.226359 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.226431 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.226819 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.237864 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:06 crc kubenswrapper[4841]: E1203 17:01:06.238026 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.254441 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":
\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.265035 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.304258 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc 
kubenswrapper[4841]: I1203 17:01:06.325633 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124a
daa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.329446 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.329476 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.329484 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.329497 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 
17:01:06.329508 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.343385 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"nam
e\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.353542 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.363112 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d509cb4-666a-49ca-a887-6a3a0acb59e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96df3f26a8680d6b5e0018c87bdd1677e6efc076b1c1759134b635d48575ef3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61aece78525cb0dbaf703af487d9ef3d7259e7e3684d5386427f25cd50662f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2279832fcaf60304abf81ce1e519adee74bde6605594e7dfc927f3d0ccc3774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.375821 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.388507 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.414425 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.428873 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.431372 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.431405 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.431413 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.431428 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.431438 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.444241 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.458224 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.472717 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.489616 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.501986 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.520687 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:06Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.533636 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.533688 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.533707 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.533730 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.533747 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.636658 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.636713 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.636729 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.636754 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.636771 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.740080 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.740118 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.740127 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.740141 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.740151 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.843131 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.843204 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.843229 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.843257 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.843280 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.945684 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.945801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.945824 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.945852 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:06 crc kubenswrapper[4841]: I1203 17:01:06.945874 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:06Z","lastTransitionTime":"2025-12-03T17:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.049574 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.049638 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.049656 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.049684 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.049708 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.152557 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.152592 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.152600 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.152614 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.152622 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.238674 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.238729 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:07 crc kubenswrapper[4841]: E1203 17:01:07.238852 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.238873 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:07 crc kubenswrapper[4841]: E1203 17:01:07.239085 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:07 crc kubenswrapper[4841]: E1203 17:01:07.239190 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.254399 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.254435 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.254447 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.254463 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.254475 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.356651 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.356725 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.356749 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.356782 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.356804 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.459678 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.459731 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.459748 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.459771 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.459789 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.563418 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.563491 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.563508 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.563535 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.563594 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.666773 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.666846 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.666868 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.666896 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.666947 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.770362 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.770412 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.770423 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.770442 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.770454 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.874115 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.874181 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.874198 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.874222 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.874240 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.977024 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.977158 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.977181 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.977233 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:07 crc kubenswrapper[4841]: I1203 17:01:07.977258 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:07Z","lastTransitionTime":"2025-12-03T17:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.080533 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.080600 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.080619 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.080644 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.080662 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:08Z","lastTransitionTime":"2025-12-03T17:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.183344 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.183399 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.183418 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.183442 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.183458 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:08Z","lastTransitionTime":"2025-12-03T17:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.242073 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:08 crc kubenswrapper[4841]: E1203 17:01:08.242251 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.285591 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.285752 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.285776 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.285799 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.285817 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:08Z","lastTransitionTime":"2025-12-03T17:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.389318 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.389394 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.389412 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.389438 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.389456 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:08Z","lastTransitionTime":"2025-12-03T17:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.492483 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.492594 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.492611 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.492634 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.492650 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:08Z","lastTransitionTime":"2025-12-03T17:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.597295 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.597373 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.597392 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.597443 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.597460 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:08Z","lastTransitionTime":"2025-12-03T17:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.700126 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.700191 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.700216 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.700241 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.700259 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:08Z","lastTransitionTime":"2025-12-03T17:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.802972 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.803014 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.803023 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.803037 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.803052 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:08Z","lastTransitionTime":"2025-12-03T17:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.906543 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.906607 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.906626 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.906650 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:08 crc kubenswrapper[4841]: I1203 17:01:08.906668 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:08Z","lastTransitionTime":"2025-12-03T17:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.010316 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.010385 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.010408 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.010437 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.010464 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.114554 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.114701 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.114725 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.114754 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.114776 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.217163 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.217218 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.217231 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.217248 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.217259 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.238230 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.238296 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:09 crc kubenswrapper[4841]: E1203 17:01:09.238373 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.238315 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:09 crc kubenswrapper[4841]: E1203 17:01:09.238483 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:09 crc kubenswrapper[4841]: E1203 17:01:09.239052 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.319986 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.320046 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.320066 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.320090 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.320169 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.422167 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.422217 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.422229 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.422247 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.422259 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.525977 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.526013 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.526032 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.526049 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.526062 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.629167 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.629435 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.629614 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.629918 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.630024 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.732433 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.732514 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.732542 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.732575 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.732599 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.835096 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.835134 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.835147 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.835162 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.835173 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.938260 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.938307 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.938318 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.938336 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:09 crc kubenswrapper[4841]: I1203 17:01:09.938347 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:09Z","lastTransitionTime":"2025-12-03T17:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.041284 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.041336 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.041352 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.041374 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.041393 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.143967 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.144025 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.144040 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.144077 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.144099 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.238258 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:10 crc kubenswrapper[4841]: E1203 17:01:10.238408 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.246298 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.246323 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.246331 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.246343 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.246352 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.320871 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.320922 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.320932 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.320949 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.320961 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: E1203 17:01:10.332415 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:10Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.337584 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.337647 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.337659 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.337678 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.337707 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: E1203 17:01:10.349565 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:10Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.354232 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.354296 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.354308 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.354325 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.354358 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: E1203 17:01:10.367636 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:10Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.371961 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.372091 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.372628 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.372848 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.373014 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: E1203 17:01:10.385716 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:10Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.389544 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.389582 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.389594 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.389608 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.389618 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: E1203 17:01:10.406545 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:10Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:10 crc kubenswrapper[4841]: E1203 17:01:10.406662 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.408232 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.408259 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.408269 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.408299 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.408308 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.510679 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.510715 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.510726 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.510744 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.510756 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.612664 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.612975 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.613068 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.613148 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.613232 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.715325 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.715575 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.715665 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.715755 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.715842 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.818932 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.818965 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.818974 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.818988 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.818998 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.921429 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.921868 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.922140 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.922335 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:10 crc kubenswrapper[4841]: I1203 17:01:10.922515 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:10Z","lastTransitionTime":"2025-12-03T17:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.024840 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.024894 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.024920 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.024937 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.024947 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.127517 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.127567 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.127577 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.127591 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.127599 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.230055 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.230095 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.230107 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.230123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.230134 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.238015 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:11 crc kubenswrapper[4841]: E1203 17:01:11.238265 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.238351 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:11 crc kubenswrapper[4841]: E1203 17:01:11.238560 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.238351 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:11 crc kubenswrapper[4841]: E1203 17:01:11.238836 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.333396 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.333470 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.333488 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.333511 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.333531 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.437042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.437128 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.437155 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.437187 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.437214 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.539819 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.539856 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.539869 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.539884 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.539949 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.642264 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.642295 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.642303 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.642315 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.642323 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.744605 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.744643 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.744654 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.744670 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.744681 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.847757 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.847813 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.847827 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.847845 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.847857 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.950849 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.950930 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.950942 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.950963 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:11 crc kubenswrapper[4841]: I1203 17:01:11.950975 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:11Z","lastTransitionTime":"2025-12-03T17:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.054197 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.054330 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.054348 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.054373 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.054385 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.156863 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.156973 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.156992 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.157017 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.157034 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.238466 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:12 crc kubenswrapper[4841]: E1203 17:01:12.238826 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.239168 4841 scope.go:117] "RemoveContainer" containerID="76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9" Dec 03 17:01:12 crc kubenswrapper[4841]: E1203 17:01:12.239341 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.258780 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.258813 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.258822 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.258835 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.258848 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.361541 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.361588 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.361600 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.361615 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.361625 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.464025 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.464063 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.464072 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.464086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.464096 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.566463 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.566501 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.566509 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.566523 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.566532 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.668396 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.668462 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.668475 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.668491 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.668503 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.771317 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.771359 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.771368 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.771384 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.771394 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.873747 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.873799 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.873808 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.873823 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.873832 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.977284 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.977332 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.977341 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.977358 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:12 crc kubenswrapper[4841]: I1203 17:01:12.977371 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:12Z","lastTransitionTime":"2025-12-03T17:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.080066 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.080136 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.080155 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.080183 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.080202 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:13Z","lastTransitionTime":"2025-12-03T17:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.182589 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.182676 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.182703 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.182729 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.182744 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:13Z","lastTransitionTime":"2025-12-03T17:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.238520 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.238520 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:13 crc kubenswrapper[4841]: E1203 17:01:13.238655 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:13 crc kubenswrapper[4841]: E1203 17:01:13.238783 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.238929 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:13 crc kubenswrapper[4841]: E1203 17:01:13.239076 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.286024 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.286068 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.286083 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.286102 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.286117 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:13Z","lastTransitionTime":"2025-12-03T17:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.388769 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.390205 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.390409 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.390512 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.390602 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:13Z","lastTransitionTime":"2025-12-03T17:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.496243 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.496310 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.496328 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.496351 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.496373 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:13Z","lastTransitionTime":"2025-12-03T17:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.599083 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.599148 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.599169 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.599197 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.599217 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:13Z","lastTransitionTime":"2025-12-03T17:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.701776 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.701827 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.701836 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.701851 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.701863 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:13Z","lastTransitionTime":"2025-12-03T17:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.804173 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.804234 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.804257 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.804285 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.804306 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:13Z","lastTransitionTime":"2025-12-03T17:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.909884 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.909941 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.909952 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.909968 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:13 crc kubenswrapper[4841]: I1203 17:01:13.909979 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:13Z","lastTransitionTime":"2025-12-03T17:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.012410 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.012458 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.012490 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.012510 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.012523 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.114609 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.114637 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.114648 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.114662 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.114671 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.217209 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.217473 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.217565 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.217694 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.217787 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.238565 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:14 crc kubenswrapper[4841]: E1203 17:01:14.238694 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.320473 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.320828 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.320979 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.321078 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.321160 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.423674 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.423705 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.423715 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.423728 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.423737 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.525895 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.525969 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.525982 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.525999 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.526011 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.628885 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.628941 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.628951 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.628964 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.628973 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.731180 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.731232 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.731250 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.731273 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.731290 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.833303 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.833342 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.833353 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.833368 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.833379 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.935329 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.935606 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.935672 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.935728 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:14 crc kubenswrapper[4841]: I1203 17:01:14.935798 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:14Z","lastTransitionTime":"2025-12-03T17:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.037537 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.037764 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.037859 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.037980 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.038048 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.140392 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.140770 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.141000 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.141217 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.141376 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.237739 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.237797 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:15 crc kubenswrapper[4841]: E1203 17:01:15.237946 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:15 crc kubenswrapper[4841]: E1203 17:01:15.238041 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.238329 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:15 crc kubenswrapper[4841]: E1203 17:01:15.238562 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.243840 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.244071 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.244208 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.244347 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.244499 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.347352 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.347628 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.347713 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.347827 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.348045 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.451025 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.451086 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.451103 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.451130 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.451146 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.554520 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.554569 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.554584 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.554603 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.554617 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.657633 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.657699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.657718 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.657742 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.657757 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.722084 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:15 crc kubenswrapper[4841]: E1203 17:01:15.722357 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:01:15 crc kubenswrapper[4841]: E1203 17:01:15.722469 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs podName:fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99 nodeName:}" failed. No retries permitted until 2025-12-03 17:01:47.722441611 +0000 UTC m=+102.109962408 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs") pod "network-metrics-daemon-fcw2m" (UID: "fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.760307 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.760558 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.760639 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.760755 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.760856 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.863979 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.864292 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.864378 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.864462 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.864544 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.967782 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.967833 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.967845 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.967865 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:15 crc kubenswrapper[4841]: I1203 17:01:15.967877 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:15Z","lastTransitionTime":"2025-12-03T17:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.070146 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.070392 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.070469 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.070533 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.070594 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.172833 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.172883 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.172896 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.172928 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.172939 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.238744 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:16 crc kubenswrapper[4841]: E1203 17:01:16.238873 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.254508 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d509cb4-666a-49ca-a887-6a3a0acb59e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96df3f26a8680d6b5e0018c87bdd1677e6efc076b1c1759134b635d48575ef3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://61aece78525cb0dbaf703af487d9ef3d7259e7e3684d5386427f25cd50662f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2279832fcaf60304abf81ce1e519adee74bde6605594e7dfc927f3d0ccc3774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.269451 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.276176 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.276223 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.276232 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.276250 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.276262 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.284268 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z 
is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.300152 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.313369 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.333265 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.351688 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.367986 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.378620 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.378672 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.378685 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.378701 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.378712 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.381248 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.393455 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.408287 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.421487 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.433870 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.445525 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc 
kubenswrapper[4841]: I1203 17:01:16.463172 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124a
daa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.478461 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.481089 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.481145 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.481160 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.481177 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.481188 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.490367 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.582859 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.582897 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.582924 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.582940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.582949 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.671551 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/0.log" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.671610 4841 generic.go:334] "Generic (PLEG): container finished" podID="0752d936-15ef-4e17-8463-3185a4c1863b" containerID="4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7" exitCode=1 Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.671671 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qwsc4" event={"ID":"0752d936-15ef-4e17-8463-3185a4c1863b","Type":"ContainerDied","Data":"4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.672235 4841 scope.go:117] "RemoveContainer" containerID="4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.685055 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.685082 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.685089 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.685102 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.685112 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.691337 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.712428 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d509cb4-666a-49ca-a887-6a3a0acb59e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96df3f26a8680d6b5e0018c87bdd1677e6efc076b1c1759134b635d48575ef3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61aece78525cb0dbaf703af487d9ef3d7259e7e3684d5386427f25cd50662f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2279832fcaf60304abf81ce1e519adee74bde6605594e7dfc927f3d0ccc3774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.726538 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.741758 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:01:16Z\\\",\\\"message\\\":\\\"2025-12-03T17:00:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016\\\\n2025-12-03T17:00:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016 to /host/opt/cni/bin/\\\\n2025-12-03T17:00:31Z [verbose] multus-daemon started\\\\n2025-12-03T17:00:31Z [verbose] Readiness Indicator file check\\\\n2025-12-03T17:01:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.757147 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.769163 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.782557 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.786856 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.786887 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.786898 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.787134 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.787147 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.795467 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.810665 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35
ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.823617 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.837091 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.848618 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPa
th\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.859446 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.869457 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc 
kubenswrapper[4841]: I1203 17:01:16.880631 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.889784 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.889819 4841 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.889828 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.889841 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.889851 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.893286 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 
17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.904448 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:16Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.992808 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.992857 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.992871 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.992892 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:16 crc kubenswrapper[4841]: I1203 17:01:16.992922 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:16Z","lastTransitionTime":"2025-12-03T17:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.096205 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.096239 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.096248 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.096262 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.096272 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:17Z","lastTransitionTime":"2025-12-03T17:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.198965 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.198998 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.199009 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.199024 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.199036 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:17Z","lastTransitionTime":"2025-12-03T17:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.237784 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.237895 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:17 crc kubenswrapper[4841]: E1203 17:01:17.238003 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:17 crc kubenswrapper[4841]: E1203 17:01:17.238089 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.238422 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:17 crc kubenswrapper[4841]: E1203 17:01:17.238566 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.301365 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.301423 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.301440 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.301460 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.301474 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:17Z","lastTransitionTime":"2025-12-03T17:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.404790 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.404839 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.404849 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.404865 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.404885 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:17Z","lastTransitionTime":"2025-12-03T17:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.507661 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.507704 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.507712 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.507727 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.507737 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:17Z","lastTransitionTime":"2025-12-03T17:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.610650 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.610707 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.610724 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.610746 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.610765 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:17Z","lastTransitionTime":"2025-12-03T17:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.678100 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/0.log" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.678215 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qwsc4" event={"ID":"0752d936-15ef-4e17-8463-3185a4c1863b","Type":"ContainerStarted","Data":"f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.693033 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:01:16Z\\\",\\\"message\\\":\\\"2025-12-03T17:00:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016\\\\n2025-12-03T17:00:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016 to /host/opt/cni/bin/\\\\n2025-12-03T17:00:31Z [verbose] multus-daemon started\\\\n2025-12-03T17:00:31Z [verbose] Readiness Indicator file check\\\\n2025-12-03T17:01:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\
",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.710084 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.714454 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.714555 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.714581 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.714639 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.714760 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:17Z","lastTransitionTime":"2025-12-03T17:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.723129 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d509cb4-666a-49ca-a887-6a3a0acb59e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96df3f26a8680d6b5e0018c87bdd1677e6efc076b1c1759134b635d48575ef3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61aece78525cb0dbaf703af487d9ef3d7259e7e3684d5386427f25cd50662f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2279832fcaf60304abf81ce1e519adee74bde6605594e7dfc927f3d0ccc3774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.735510 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.753941 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.769778 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.784799 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.804352 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.815286 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.817394 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.817455 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.817478 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.817507 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.817529 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:17Z","lastTransitionTime":"2025-12-03T17:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.830247 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.842888 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.855157 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc 
kubenswrapper[4841]: I1203 17:01:17.866117 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.876188 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.887053 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03
T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.898476 4841 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.911597 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 
17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:17Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.920015 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.920042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.920053 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.920068 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:17 crc kubenswrapper[4841]: I1203 17:01:17.920082 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:17Z","lastTransitionTime":"2025-12-03T17:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.022842 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.022880 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.022888 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.022914 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.022925 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.125390 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.125425 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.125435 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.125448 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.125457 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.228856 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.228896 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.228920 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.228935 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.228945 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.238352 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:18 crc kubenswrapper[4841]: E1203 17:01:18.238580 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.331018 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.331139 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.331176 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.331199 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.331217 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.434146 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.434182 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.434195 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.434211 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.434221 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.537419 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.537454 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.537465 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.537481 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.537492 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.641229 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.641312 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.641337 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.641364 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.641386 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.744948 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.744992 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.745002 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.745018 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.745030 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.847826 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.847949 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.847979 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.848012 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.848036 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.950706 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.950741 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.950752 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.950766 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:18 crc kubenswrapper[4841]: I1203 17:01:18.950776 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:18Z","lastTransitionTime":"2025-12-03T17:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.053525 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.053572 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.053583 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.053597 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.053605 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.156811 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.156860 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.156873 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.156892 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.156926 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.238331 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.238433 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.238443 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:19 crc kubenswrapper[4841]: E1203 17:01:19.238591 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:19 crc kubenswrapper[4841]: E1203 17:01:19.238825 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:19 crc kubenswrapper[4841]: E1203 17:01:19.238923 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.259682 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.259765 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.259828 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.259856 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.259878 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.361748 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.361784 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.361793 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.361807 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.361817 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.464699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.464764 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.464780 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.464802 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.464816 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.568640 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.568713 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.568732 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.568760 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.568780 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.672505 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.672574 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.672595 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.672626 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.672643 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.781004 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.781072 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.781094 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.781133 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.781154 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.884228 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.884276 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.884285 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.884301 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.884311 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.986942 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.986987 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.986996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.987011 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:19 crc kubenswrapper[4841]: I1203 17:01:19.987021 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:19Z","lastTransitionTime":"2025-12-03T17:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.090047 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.090115 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.090132 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.090157 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.090179 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.193316 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.193621 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.193694 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.193772 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.193881 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.238226 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:20 crc kubenswrapper[4841]: E1203 17:01:20.238630 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.296206 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.296255 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.296270 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.296287 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.296300 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.399079 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.399284 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.399367 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.399430 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.399517 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.502611 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.502699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.502747 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.502775 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.502791 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.604989 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.605453 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.605549 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.605613 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.605666 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.708441 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.708751 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.708848 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.708962 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.709031 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.746532 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.746581 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.746594 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.746614 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.746626 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: E1203 17:01:20.765432 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:20Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.769432 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.769495 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.769521 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.769546 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.769565 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: E1203 17:01:20.786464 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:20Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.789763 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.789806 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.789816 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.789837 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.789850 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: E1203 17:01:20.805553 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:20Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.811211 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.811238 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.811248 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.811263 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.811273 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: E1203 17:01:20.826118 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:20Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.830132 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.830164 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.830179 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.830197 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.830210 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: E1203 17:01:20.842458 4841 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"d8830a74-3409-4e59-a7ee-2c2a0b4959ce\\\",\\\"systemUUID\\\":\\\"07a56beb-ca95-4540-9e44-8534d93c2a77\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:20Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:20 crc kubenswrapper[4841]: E1203 17:01:20.842570 4841 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.844316 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.844366 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.844381 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.844398 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.844410 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.947201 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.947231 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.947240 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.947253 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:20 crc kubenswrapper[4841]: I1203 17:01:20.947264 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:20Z","lastTransitionTime":"2025-12-03T17:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.049629 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.049673 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.049685 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.049701 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.049713 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.152721 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.152780 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.152800 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.152829 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.152849 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.238578 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.238599 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.238675 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:21 crc kubenswrapper[4841]: E1203 17:01:21.238692 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:21 crc kubenswrapper[4841]: E1203 17:01:21.238811 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:21 crc kubenswrapper[4841]: E1203 17:01:21.238870 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.255996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.256105 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.256128 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.256152 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.256170 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.358995 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.359054 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.359074 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.359134 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.359156 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.462286 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.462344 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.462357 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.462380 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.462395 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.565084 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.565215 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.565229 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.565247 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.565264 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.667940 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.668026 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.668040 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.668065 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.668080 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.771406 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.771468 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.771486 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.771515 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.771534 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.874138 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.874213 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.874239 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.874270 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.874292 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.977509 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.977608 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.977641 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.977670 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:21 crc kubenswrapper[4841]: I1203 17:01:21.977688 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:21Z","lastTransitionTime":"2025-12-03T17:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.080692 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.080757 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.080770 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.080790 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.080804 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:22Z","lastTransitionTime":"2025-12-03T17:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.183739 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.183801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.183820 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.183888 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.183948 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:22Z","lastTransitionTime":"2025-12-03T17:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.238607 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:22 crc kubenswrapper[4841]: E1203 17:01:22.238811 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.286689 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.286764 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.286788 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.286819 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.286840 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:22Z","lastTransitionTime":"2025-12-03T17:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.390549 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.390632 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.390652 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.390677 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.390694 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:22Z","lastTransitionTime":"2025-12-03T17:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.493673 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.493730 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.493746 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.493769 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.493785 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:22Z","lastTransitionTime":"2025-12-03T17:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.595537 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.595579 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.595590 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.595607 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.595620 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:22Z","lastTransitionTime":"2025-12-03T17:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.698438 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.699058 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.699090 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.699113 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.699125 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:22Z","lastTransitionTime":"2025-12-03T17:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.802003 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.802052 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.802068 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.802088 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.802105 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:22Z","lastTransitionTime":"2025-12-03T17:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.905582 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.905659 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.905685 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.905715 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:22 crc kubenswrapper[4841]: I1203 17:01:22.905737 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:22Z","lastTransitionTime":"2025-12-03T17:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.008353 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.008389 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.008399 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.008415 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.008430 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.111499 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.111603 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.111626 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.111661 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.111689 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.214199 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.214267 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.214281 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.214296 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.214308 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.237867 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.237930 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.237890 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:23 crc kubenswrapper[4841]: E1203 17:01:23.238081 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:23 crc kubenswrapper[4841]: E1203 17:01:23.238235 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:23 crc kubenswrapper[4841]: E1203 17:01:23.238373 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.317217 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.317263 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.317275 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.317292 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.317308 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.419215 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.419273 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.419289 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.419311 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.419331 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.521835 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.522220 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.522396 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.522571 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.522768 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.625675 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.626060 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.626200 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.626315 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.626428 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.729250 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.729284 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.729295 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.729311 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.729322 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.832718 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.832776 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.832793 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.832816 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.832834 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.935623 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.935687 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.935709 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.935736 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:23 crc kubenswrapper[4841]: I1203 17:01:23.935758 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:23Z","lastTransitionTime":"2025-12-03T17:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.039106 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.039168 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.039185 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.039207 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.039225 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.142767 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.142832 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.142854 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.142883 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.142936 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.238680 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:24 crc kubenswrapper[4841]: E1203 17:01:24.239220 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.240257 4841 scope.go:117] "RemoveContainer" containerID="76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.245361 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.245396 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.245412 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.245433 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.245451 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.348921 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.348960 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.348969 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.348984 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.348995 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.452285 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.452344 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.452361 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.452388 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.452406 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.555744 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.555822 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.555842 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.555873 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.555893 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.658498 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.658540 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.658550 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.658568 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.658579 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.705540 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/2.log" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.708033 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.708814 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.721252 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d509cb4-666a-49ca-a887-6a3a0acb59e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96df3f26a8680d6b5e0018c87bdd1677e6efc076b1c1759134b635d48575ef3f\\\",\\\"image\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61aece78525cb0dbaf703af487d9ef3d7259e7e3684d5386427f25cd50662f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2279832fcaf60304abf81ce1e519adee74bde6605594e7dfc927f3d0ccc3774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.734639 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.749867 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:01:16Z\\\",\\\"message\\\":\\\"2025-12-03T17:00:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016\\\\n2025-12-03T17:00:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016 to /host/opt/cni/bin/\\\\n2025-12-03T17:00:31Z [verbose] multus-daemon started\\\\n2025-12-03T17:00:31Z [verbose] Readiness Indicator file check\\\\n2025-12-03T17:01:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.761226 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.761275 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.761288 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.761305 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.761316 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.776515 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.795026 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35
ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.809499 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.827688 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.838611 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.851811 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.863644 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.863854 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.863960 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 
17:01:24.864045 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.864114 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.864202 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.874715 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.884658 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.893953 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/e
tc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.902961 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.914522 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" 
certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.924388 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.933637 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:24Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.966145 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.966191 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.966212 4841 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.966231 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:24 crc kubenswrapper[4841]: I1203 17:01:24.966244 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:24Z","lastTransitionTime":"2025-12-03T17:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.068420 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.068461 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.068476 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.068493 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.068506 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.170710 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.170744 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.170756 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.170771 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.170781 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.238119 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.238179 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:25 crc kubenswrapper[4841]: E1203 17:01:25.238238 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.238131 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:25 crc kubenswrapper[4841]: E1203 17:01:25.238326 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:25 crc kubenswrapper[4841]: E1203 17:01:25.238407 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.276439 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.276484 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.276497 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.276523 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.276535 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.378952 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.379300 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.379490 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.379738 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.379985 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.482632 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.482692 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.482704 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.482723 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.482735 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.586688 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.586775 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.586787 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.586801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.586810 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.689223 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.689448 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.689457 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.689471 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.689480 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.712829 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/3.log" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.713432 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/2.log" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.715847 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" exitCode=1 Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.715913 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.715988 4841 scope.go:117] "RemoveContainer" containerID="76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.716596 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:01:25 crc kubenswrapper[4841]: E1203 17:01:25.716775 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.730980 4841 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/o
penshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db
22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 
genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.745189 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.756810 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.767199 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d509cb4-666a-49ca-a887-6a3a0acb59e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96df3f26a8680d6b5e0018c87bdd1677e6efc076b1c1759134b635d48575ef3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61aece78525cb0dbaf703af487d9ef3d7259e7e3684d5386427f25cd50662f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2279832fcaf60304abf81ce1e519adee74bde6605594e7dfc927f3d0ccc3774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.777454 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.789606 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:01:16Z\\\",\\\"message\\\":\\\"2025-12-03T17:00:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to 
/host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016\\\\n2025-12-03T17:00:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016 to /host/opt/cni/bin/\\\\n2025-12-03T17:00:31Z [verbose] multus-daemon started\\\\n2025-12-03T17:00:31Z [verbose] Readiness Indicator file check\\\\n2025-12-03T17:01:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.
d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.791357 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.791402 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.791414 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.791431 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.791445 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.810661 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", 
ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:01:25Z\\\",\\\"message\\\":\\\"vnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller 
initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z]\\\\nI1203 17:01:25.197657 6834 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 17:01:25.197666 6834 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-d5svt\\\\nI1203 17:01:25.197670 6834 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 17:01:25.197674 6834 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-d5svt\\\\nI1203 17:01:25.197681 6834 ovn.go:134] Ensuring zone local 
for\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.821859 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.837853 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.853089 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.868007 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.881658 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.892752 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.894107 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.894140 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.894151 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.894167 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.894180 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.907459 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.919025 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: 
Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.929970 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/service
ca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.941295 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.996521 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.996562 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.996573 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.996586 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:25 crc kubenswrapper[4841]: I1203 17:01:25.996595 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:25Z","lastTransitionTime":"2025-12-03T17:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.099540 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.099579 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.099591 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.099607 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.099621 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:26Z","lastTransitionTime":"2025-12-03T17:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.201626 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.201675 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.201685 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.201699 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.201709 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:26Z","lastTransitionTime":"2025-12-03T17:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.238734 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:26 crc kubenswrapper[4841]: E1203 17:01:26.239049 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.250601 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://
38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.267642 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.282634 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\"
,\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 
17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.296476 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.304253 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.304461 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.304559 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.304654 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.304744 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:26Z","lastTransitionTime":"2025-12-03T17:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.313576 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:01:16Z\\\",\\\"message\\\":\\\"2025-12-03T17:00:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016\\\\n2025-12-03T17:00:30+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016 to /host/opt/cni/bin/\\\\n2025-12-03T17:00:31Z [verbose] multus-daemon started\\\\n2025-12-03T17:00:31Z [verbose] Readiness Indicator file check\\\\n2025-12-03T17:01:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.340305 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76aa636322dd56cc5c8de3f3bc65b9ac197f054df77641d8f45ea5485503a0d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:00:56Z\\\",\\\"message\\\":\\\"o:473] Services do not match for network=default, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"7f9b8f25-db1a-4d02-a423-9afc5c2fb83c\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-storage-version-migrator-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.36\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF1203 17:00:56.061722 6467 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:01:25Z\\\",\\\"message\\\":\\\"vnkube: [failed to 
start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z]\\\\nI1203 17:01:25.197657 6834 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 17:01:25.197666 6834 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-d5svt\\\\nI1203 17:01:25.197670 6834 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 17:01:25.197674 6834 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-d5svt\\\\nI1203 17:01:25.197681 6834 ovn.go:134] Ensuring zone local 
for\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:01:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc
\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.353443 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d509cb4-666a-49ca-a887-6a3a0acb59e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96df3f26a8680d6b5e0018c87bdd1677e6efc076b1c1759134b635d48575ef3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61aece78525cb0dbaf703af487d9ef3d7259e7e3684d5386427f25cd50662f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2279832fcaf60304abf81ce1e519adee74bde6605594e7dfc927f3d0ccc3774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.367328 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.380763 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.396202 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.409489 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.409541 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.409556 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 
17:01:26.409577 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.409594 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:26Z","lastTransitionTime":"2025-12-03T17:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.415277 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.429636 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.452046 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.466567 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.478493 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.491346 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc 
kubenswrapper[4841]: I1203 17:01:26.503749 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.512516 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.512557 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.512569 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.512586 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.512598 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:26Z","lastTransitionTime":"2025-12-03T17:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.615284 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.615346 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.615409 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.615440 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.615462 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:26Z","lastTransitionTime":"2025-12-03T17:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.718766 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.718801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.718811 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.718826 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.718836 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:26Z","lastTransitionTime":"2025-12-03T17:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.722288 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/3.log" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.726447 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:01:26 crc kubenswrapper[4841]: E1203 17:01:26.726822 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.737018 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-86vxf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e7c0c3ed-ce7b-49df-b7b3-13cfe6898bd6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96997e285d30a47a550f1d7cf4860ae2b5d3c797af05e9d4d633ba700453d0a3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vn4dk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-86vxf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.752135 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-qpptb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"46566bd5-34f0-4858-9a20-3a78c292e4ba\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://20a33c927176705cbd6936c214b17808394173d8681f84943b025b93c0a3e4ea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c95471d3a07c20ceea51e3acfffc40a64330094483c344f69400d1af15b9abe7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9a00e817194fb2b6b47e320357caf232bba80444efadca0305d49d75f5e2a0be\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d64d64c533aa89ac183c6f623241f36efbdf18b089295773bf1a9910ccdbaf08\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f3f35ea8e09cbfe705e29911d6955f37930caaa19e214e355dfde646ec4174f6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe19
1f53b0767324706a6902bcdc1b43ad55\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2292adec3ded11518f850b1e271bbe191f53b0767324706a6902bcdc1b43ad55\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:36Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1f8a55a8ae3223b7f3a010480077a7e8ab1adc1f1566228c0a4dc13ea6de582f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2025-12-03T17:00:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-b2mz6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-qpptb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.771097 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"827c7463-d83d-4f7a-8e08-e094bfdb24b6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5923694c73c1fa52bc58648a5c9b4ab75e6075e42451bd5edb4fd0dd1022c5bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://72ef61f15d797f26d41d0454d6c4c27039e9a4808f433b22162806cd31f47af4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://162c6ed473ed9e69eea2a2bdff2a7ba2f67b711c426367a2fc9527b99aa51117\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.790490 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://92053cfaae1894647d214f1da5b901661281c320433ebdd8c8220e4b4f138fab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.803769 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.818418 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.820786 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.820973 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.821092 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 
17:01:26.821197 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.821289 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:26Z","lastTransitionTime":"2025-12-03T17:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.832048 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.844999 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4030b1cc190e303b154d5683a1e40bde219fa01fcc77675500f646d8fefbe764\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e
189291cd904cd8025d257014\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8dsfc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-c9kmk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.855677 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-bglx8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"dac136e3-68d3-410c-b010-a7509fffb25c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ffc1b12a01499fb22ba94a3098129078c7224a5203ef8409bbad36a652824683\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kvdx9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:31Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-bglx8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.867140 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-t5cpx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:43Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-fcw2m\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc 
kubenswrapper[4841]: I1203 17:01:26.880225 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9951048e7124a
daa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T17:00:24Z\\\",\\\"message\\\":\\\"shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file\\\\nI1203 17:00:24.663212 1 configmap_cafile_content.go:205] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\\\\"\\\\nI1203 17:00:24.663219 1 shared_informer.go:313] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file\\\\nI1203 17:00:24.663287 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"WatchListClient\\\\\\\" enabled=false\\\\nI1203 17:00:24.663331 1 envvar.go:172] \\\\\\\"Feature gate default state\\\\\\\" feature=\\\\\\\"InformerResourceVersion\\\\\\\" enabled=false\\\\nI1203 17:00:24.663427 1 tlsconfig.go:203] \\\\\\\"Loaded serving cert\\\\\\\" certName=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"localhost\\\\\\\\\\\\\\\" [serving] validServingFor=[localhost] issuer=\\\\\\\\\\\\\\\"check-endpoints-signer@1764781208\\\\\\\\\\\\\\\" (2025-12-03 17:00:07 +0000 UTC to 2026-01-02 17:00:08 +0000 UTC (now=2025-12-03 17:00:24.663400819 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665480 1 named_certificates.go:53] \\\\\\\"Loaded SNI cert\\\\\\\" index=0 certName=\\\\\\\"self-signed loopback\\\\\\\" certDetail=\\\\\\\"\\\\\\\\\\\\\\\"apiserver-loopback-client@1764781219\\\\\\\\\\\\\\\" [serving] validServingFor=[apiserver-loopback-client] issuer=\\\\\\\\\\\\\\\"apiserver-loopback-client-ca@1764781219\\\\\\\\\\\\\\\" (2025-12-03 16:00:18 +0000 UTC to 2026-12-03 16:00:18 +0000 UTC (now=2025-12-03 17:00:24.665413916 +0000 UTC))\\\\\\\"\\\\nI1203 17:00:24.665507 1 
secure_serving.go:213] Serving securely on [::]:17697\\\\nI1203 17:00:24.665538 1 genericapiserver.go:683] [graceful-termination] waiting for shutdown to be initiated\\\\nI1203 17:00:24.665554 1 tlsconfig.go:243] \\\\\\\"Starting DynamicServingCertificateController\\\\\\\"\\\\nI1203 17:00:24.665740 1 dynamic_serving_content.go:135] \\\\\\\"Starting controller\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1471736518/tls.crt::/tmp/serving-cert-1471736518/tls.key\\\\\\\"\\\\nF1203 17:00:24.668001 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:08Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06b
c35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.894663 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:27Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://491c806cc8f6fed314f66ec1721e19353f658b73922d572dcb1e46e33be0aae3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38b332b952122175a1a79b69ac644a47aa7030b77c87ae63e885982836c9d28f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.906818 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2d1d9aac-8558-4b49-a650-435b4fb09a09\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0bd64a4799e4437c5c54e58873eab64d16d2bba8f317bb4e1ddd1145a12970d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad9a7802f72d2f590f7320d0151dfc4dc3273
979d79c88f7a020a57078ce588b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9l9jp\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:41Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-m2ccv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.920743 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0d509cb4-666a-49ca-a887-6a3a0acb59e1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:00Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://96df3f26a8680d6b5e0018c87bdd1677e6efc076b1c1759134b635d48575ef3f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://61aece78525cb0dbaf703af487d9ef3d7259e7e3684d5386427f25cd50662f3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2279832fcaf60304abf81ce1e519adee74bde6605594e7dfc927f3d0ccc3774\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://006d9f0d06355169184bcc489e92cf688c96b95ee44cea283908a86a5cdd8a36\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:07Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:07Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:06Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.924449 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.924507 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.924526 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.924550 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.924569 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:26Z","lastTransitionTime":"2025-12-03T17:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.932145 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68f8a94a39fe755f643ec72c0710e305704014241fa6889a5382b26d4a4c5c46\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.944964 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-qwsc4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0752d936-15ef-4e17-8463-3185a4c1863b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:01:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4af1276465dfa4
745621713daa963839011a886d70c9006e983ab927e284ead7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:01:16Z\\\",\\\"message\\\":\\\"2025-12-03T17:00:30+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016\\\\n2025-12-03T17:00:30+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_4271b0ce-8999-4b31-8dc0-79208c01f016 to /host/opt/cni/bin/\\\\n2025-12-03T17:00:31Z [verbose] multus-daemon started\\\\n2025-12-03T17:00:31Z [verbose] Readiness Indicator file check\\\\n2025-12-03T17:01:16Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:29Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:01:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\"
,\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4gc8l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-multus\"/\"multus-qwsc4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:26 crc kubenswrapper[4841]: I1203 17:01:26.963798 4841 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1853b500-b218-4412-9cbc-9fd0a76778c0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T17:00:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T17:01:25Z\\\",\\\"message\\\":\\\"vnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, 
failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:25Z is after 2025-08-24T17:21:41Z]\\\\nI1203 17:01:25.197657 6834 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 17:01:25.197666 6834 obj_retry.go:303] Retry object setup: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-d5svt\\\\nI1203 17:01:25.197670 6834 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI1203 17:01:25.197674 6834 obj_retry.go:365] Adding new object: *v1.Pod openshift-ovn-kubernetes/ovnkube-node-d5svt\\\\nI1203 17:01:25.197681 6834 ovn.go:134] Ensuring zone local for\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T17:01:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T17:00:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fcf2ec44f6b493bd4
c27d26be4a159c53e610338201da820b880d38f659bbcc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T17:00:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T17:00:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xgj9j\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T17:00:29Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-d5svt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T17:01:26Z is after 2025-08-24T17:21:41Z" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.026824 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.026890 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.026963 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.026997 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.027021 4841 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.130110 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.130195 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.130223 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.130257 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.130283 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.232523 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.232574 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.232585 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.232598 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.232608 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.238189 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.238200 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.238300 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:27 crc kubenswrapper[4841]: E1203 17:01:27.238307 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:27 crc kubenswrapper[4841]: E1203 17:01:27.238485 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:27 crc kubenswrapper[4841]: E1203 17:01:27.238669 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.336154 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.336557 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.336775 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.337055 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.337267 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.440103 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.440501 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.440762 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.441008 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.441239 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.545228 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.545300 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.545324 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.545353 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.545377 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.648119 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.648179 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.648203 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.648229 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.648265 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.751569 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.751632 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.751653 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.751676 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.751693 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.853720 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.853768 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.853783 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.853801 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.853814 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.955636 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.955678 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.955689 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.955703 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:27 crc kubenswrapper[4841]: I1203 17:01:27.955713 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:27Z","lastTransitionTime":"2025-12-03T17:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.057637 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.057671 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.057679 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.057693 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.057703 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.159951 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.159996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.160008 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.160025 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.160041 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.238467 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:28 crc kubenswrapper[4841]: E1203 17:01:28.238590 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.262377 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.262442 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.262465 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.262495 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.262520 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.365691 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.365741 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.365755 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.365773 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.365785 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.468332 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.468600 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.468730 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.468867 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.469024 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.572042 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.572095 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.572109 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.572126 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.572138 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.675682 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.676089 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.676254 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.676354 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.676446 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.779162 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.779555 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.779647 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.779714 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.779767 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.882268 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.882326 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.882344 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.882368 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.882386 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.984756 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.984815 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.984828 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.984847 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:28 crc kubenswrapper[4841]: I1203 17:01:28.984859 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:28Z","lastTransitionTime":"2025-12-03T17:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.087847 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.087899 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.087931 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.087951 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.087965 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:29Z","lastTransitionTime":"2025-12-03T17:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.190536 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.190579 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.190589 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.190606 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.190615 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:29Z","lastTransitionTime":"2025-12-03T17:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.238571 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.238669 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:29 crc kubenswrapper[4841]: E1203 17:01:29.238715 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.238775 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:29 crc kubenswrapper[4841]: E1203 17:01:29.238931 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:29 crc kubenswrapper[4841]: E1203 17:01:29.238993 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.293399 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.293475 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.293487 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.293530 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.293671 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:29Z","lastTransitionTime":"2025-12-03T17:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.396682 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.396726 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.396739 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.396754 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.396765 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:29Z","lastTransitionTime":"2025-12-03T17:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.500269 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.500325 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.500336 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.500354 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.500365 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:29Z","lastTransitionTime":"2025-12-03T17:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.603943 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.604000 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.604023 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.604051 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.604075 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:29Z","lastTransitionTime":"2025-12-03T17:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.707015 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.707064 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.707077 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.707099 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.707110 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:29Z","lastTransitionTime":"2025-12-03T17:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.810601 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.810675 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.810686 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.810700 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.810711 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:29Z","lastTransitionTime":"2025-12-03T17:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.913373 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.913431 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.913444 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.913465 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:29 crc kubenswrapper[4841]: I1203 17:01:29.913478 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:29Z","lastTransitionTime":"2025-12-03T17:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.016140 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.016183 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.016193 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.016208 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.016221 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.118299 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.118344 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.118355 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.118373 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.118384 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.220847 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.220880 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.220891 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.220994 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.221008 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.238806 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.239073 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.275257 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.275490 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.275537 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.275502402 +0000 UTC m=+148.663023169 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.275640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.275680 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.275716 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.275738 4841 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.275751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.275817 4841 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.275843 4841 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.275815 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.275788199 +0000 UTC m=+148.663308966 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.275883 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.275896 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.275883461 +0000 UTC m=+148.663404218 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.275991 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.275979744 +0000 UTC m=+148.663500511 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.276041 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.276056 4841 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.276064 4841 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:01:30 crc kubenswrapper[4841]: E1203 17:01:30.276088 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.276079896 +0000 UTC m=+148.663600623 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.324031 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.324101 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.324123 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.324151 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.324172 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.426464 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.426519 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.426541 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.426570 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.426592 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.530068 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.530393 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.530598 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.530769 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.530985 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.634849 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.634938 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.634953 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.634971 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.634984 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.739046 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.739096 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.739110 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.739129 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.739145 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.841996 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.842037 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.842049 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.842067 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.842078 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.944926 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.944972 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.944983 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.944998 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:30 crc kubenswrapper[4841]: I1203 17:01:30.945010 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:30Z","lastTransitionTime":"2025-12-03T17:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.026658 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.026711 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.026720 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.026738 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.026747 4841 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T17:01:31Z","lastTransitionTime":"2025-12-03T17:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.072797 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq"] Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.073476 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.078673 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.084134 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.084153 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.084334 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.101224 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=66.101175661 podStartE2EDuration="1m6.101175661s" podCreationTimestamp="2025-12-03 17:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:31.101064718 +0000 UTC m=+85.488585455" watchObservedRunningTime="2025-12-03 17:01:31.101175661 +0000 UTC m=+85.488696398" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.134855 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-m2ccv" podStartSLOduration=62.134817967 podStartE2EDuration="1m2.134817967s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:31.134536961 +0000 UTC m=+85.522057688" watchObservedRunningTime="2025-12-03 17:01:31.134817967 +0000 UTC 
m=+85.522338704" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.169227 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=31.169207251 podStartE2EDuration="31.169207251s" podCreationTimestamp="2025-12-03 17:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:31.15309478 +0000 UTC m=+85.540615517" watchObservedRunningTime="2025-12-03 17:01:31.169207251 +0000 UTC m=+85.556727988" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.181795 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qwsc4" podStartSLOduration=62.181781929 podStartE2EDuration="1m2.181781929s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:31.181679927 +0000 UTC m=+85.569200654" watchObservedRunningTime="2025-12-03 17:01:31.181781929 +0000 UTC m=+85.569302666" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.185881 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e7d09d-899d-46a9-9954-e88a6392ac31-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.185952 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84e7d09d-899d-46a9-9954-e88a6392ac31-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.185973 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/84e7d09d-899d-46a9-9954-e88a6392ac31-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.186002 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84e7d09d-899d-46a9-9954-e88a6392ac31-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.186037 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/84e7d09d-899d-46a9-9954-e88a6392ac31-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.222592 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=66.222561305 podStartE2EDuration="1m6.222561305s" podCreationTimestamp="2025-12-03 17:00:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:31.222555734 +0000 UTC m=+85.610076501" watchObservedRunningTime="2025-12-03 17:01:31.222561305 
+0000 UTC m=+85.610082032" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.238164 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.238214 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.238164 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:31 crc kubenswrapper[4841]: E1203 17:01:31.238487 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:31 crc kubenswrapper[4841]: E1203 17:01:31.238674 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:31 crc kubenswrapper[4841]: E1203 17:01:31.238771 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.287043 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/84e7d09d-899d-46a9-9954-e88a6392ac31-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.287097 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84e7d09d-899d-46a9-9954-e88a6392ac31-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.287135 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/84e7d09d-899d-46a9-9954-e88a6392ac31-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.287181 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e7d09d-899d-46a9-9954-e88a6392ac31-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.287214 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/84e7d09d-899d-46a9-9954-e88a6392ac31-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.287256 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84e7d09d-899d-46a9-9954-e88a6392ac31-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.287261 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/84e7d09d-899d-46a9-9954-e88a6392ac31-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.288399 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/84e7d09d-899d-46a9-9954-e88a6392ac31-service-ca\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.293526 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/84e7d09d-899d-46a9-9954-e88a6392ac31-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 
17:01:31.319295 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-86vxf" podStartSLOduration=63.319268554 podStartE2EDuration="1m3.319268554s" podCreationTimestamp="2025-12-03 17:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:31.31908776 +0000 UTC m=+85.706608487" watchObservedRunningTime="2025-12-03 17:01:31.319268554 +0000 UTC m=+85.706789301" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.326183 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/84e7d09d-899d-46a9-9954-e88a6392ac31-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-bw9tq\" (UID: \"84e7d09d-899d-46a9-9954-e88a6392ac31\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.371981 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-qpptb" podStartSLOduration=62.371953461 podStartE2EDuration="1m2.371953461s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:31.360867689 +0000 UTC m=+85.748388406" watchObservedRunningTime="2025-12-03 17:01:31.371953461 +0000 UTC m=+85.759474178" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.390103 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-bglx8" podStartSLOduration=63.390085621 podStartE2EDuration="1m3.390085621s" podCreationTimestamp="2025-12-03 17:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:31.39004743 +0000 UTC 
m=+85.777568157" watchObservedRunningTime="2025-12-03 17:01:31.390085621 +0000 UTC m=+85.777606338" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.390517 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podStartSLOduration=62.390513311 podStartE2EDuration="1m2.390513311s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:31.381936808 +0000 UTC m=+85.769457535" watchObservedRunningTime="2025-12-03 17:01:31.390513311 +0000 UTC m=+85.778034038" Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.392847 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" Dec 03 17:01:31 crc kubenswrapper[4841]: W1203 17:01:31.407580 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84e7d09d_899d_46a9_9954_e88a6392ac31.slice/crio-c9d0f49c3debb176efdc6ac3e03726c36781c497af7addf9b9d3f7354195a128 WatchSource:0}: Error finding container c9d0f49c3debb176efdc6ac3e03726c36781c497af7addf9b9d3f7354195a128: Status 404 returned error can't find the container with id c9d0f49c3debb176efdc6ac3e03726c36781c497af7addf9b9d3f7354195a128 Dec 03 17:01:31 crc kubenswrapper[4841]: I1203 17:01:31.746248 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" event={"ID":"84e7d09d-899d-46a9-9954-e88a6392ac31","Type":"ContainerStarted","Data":"c9d0f49c3debb176efdc6ac3e03726c36781c497af7addf9b9d3f7354195a128"} Dec 03 17:01:32 crc kubenswrapper[4841]: I1203 17:01:32.238131 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:32 crc kubenswrapper[4841]: E1203 17:01:32.238283 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:32 crc kubenswrapper[4841]: I1203 17:01:32.752384 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" event={"ID":"84e7d09d-899d-46a9-9954-e88a6392ac31","Type":"ContainerStarted","Data":"ecdd4692999972a71e66382e413884d2760fb26880b6b479e7e93370afab89bb"} Dec 03 17:01:32 crc kubenswrapper[4841]: I1203 17:01:32.772789 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-bw9tq" podStartSLOduration=63.772766396 podStartE2EDuration="1m3.772766396s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:32.771723601 +0000 UTC m=+87.159244328" watchObservedRunningTime="2025-12-03 17:01:32.772766396 +0000 UTC m=+87.160287123" Dec 03 17:01:33 crc kubenswrapper[4841]: I1203 17:01:33.237762 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:33 crc kubenswrapper[4841]: I1203 17:01:33.237804 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:33 crc kubenswrapper[4841]: E1203 17:01:33.237890 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:33 crc kubenswrapper[4841]: I1203 17:01:33.237924 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:33 crc kubenswrapper[4841]: E1203 17:01:33.238048 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:33 crc kubenswrapper[4841]: E1203 17:01:33.238200 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:34 crc kubenswrapper[4841]: I1203 17:01:34.238341 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:34 crc kubenswrapper[4841]: E1203 17:01:34.238678 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:34 crc kubenswrapper[4841]: I1203 17:01:34.256231 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 17:01:35 crc kubenswrapper[4841]: I1203 17:01:35.238151 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:35 crc kubenswrapper[4841]: I1203 17:01:35.238189 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:35 crc kubenswrapper[4841]: I1203 17:01:35.238189 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:35 crc kubenswrapper[4841]: E1203 17:01:35.238279 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:35 crc kubenswrapper[4841]: E1203 17:01:35.238412 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:35 crc kubenswrapper[4841]: E1203 17:01:35.238480 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:36 crc kubenswrapper[4841]: I1203 17:01:36.238126 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:36 crc kubenswrapper[4841]: E1203 17:01:36.239896 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:36 crc kubenswrapper[4841]: I1203 17:01:36.260284 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=2.260256802 podStartE2EDuration="2.260256802s" podCreationTimestamp="2025-12-03 17:01:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:36.259379601 +0000 UTC m=+90.646900379" watchObservedRunningTime="2025-12-03 17:01:36.260256802 +0000 UTC m=+90.647777559" Dec 03 17:01:37 crc kubenswrapper[4841]: I1203 17:01:37.238219 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:37 crc kubenswrapper[4841]: I1203 17:01:37.238260 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:37 crc kubenswrapper[4841]: E1203 17:01:37.238401 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:37 crc kubenswrapper[4841]: I1203 17:01:37.238465 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:37 crc kubenswrapper[4841]: E1203 17:01:37.238661 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:37 crc kubenswrapper[4841]: E1203 17:01:37.238808 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:38 crc kubenswrapper[4841]: I1203 17:01:38.238698 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:38 crc kubenswrapper[4841]: E1203 17:01:38.238850 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:39 crc kubenswrapper[4841]: I1203 17:01:39.238111 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:39 crc kubenswrapper[4841]: I1203 17:01:39.238147 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:39 crc kubenswrapper[4841]: E1203 17:01:39.238226 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:39 crc kubenswrapper[4841]: I1203 17:01:39.238111 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:39 crc kubenswrapper[4841]: E1203 17:01:39.238393 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:39 crc kubenswrapper[4841]: E1203 17:01:39.238579 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:40 crc kubenswrapper[4841]: I1203 17:01:40.238847 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:40 crc kubenswrapper[4841]: E1203 17:01:40.239068 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:41 crc kubenswrapper[4841]: I1203 17:01:41.238579 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:41 crc kubenswrapper[4841]: I1203 17:01:41.238595 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:41 crc kubenswrapper[4841]: I1203 17:01:41.238645 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:41 crc kubenswrapper[4841]: E1203 17:01:41.239865 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:41 crc kubenswrapper[4841]: E1203 17:01:41.240139 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:41 crc kubenswrapper[4841]: E1203 17:01:41.240322 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:41 crc kubenswrapper[4841]: I1203 17:01:41.240675 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:01:41 crc kubenswrapper[4841]: E1203 17:01:41.241111 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:01:42 crc kubenswrapper[4841]: I1203 17:01:42.238275 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:42 crc kubenswrapper[4841]: E1203 17:01:42.238456 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:43 crc kubenswrapper[4841]: I1203 17:01:43.237733 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:43 crc kubenswrapper[4841]: E1203 17:01:43.238339 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:43 crc kubenswrapper[4841]: I1203 17:01:43.237771 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:43 crc kubenswrapper[4841]: I1203 17:01:43.237773 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:43 crc kubenswrapper[4841]: E1203 17:01:43.238720 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:43 crc kubenswrapper[4841]: E1203 17:01:43.238576 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:44 crc kubenswrapper[4841]: I1203 17:01:44.238118 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:44 crc kubenswrapper[4841]: E1203 17:01:44.238684 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:45 crc kubenswrapper[4841]: I1203 17:01:45.238508 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:45 crc kubenswrapper[4841]: E1203 17:01:45.238642 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:45 crc kubenswrapper[4841]: I1203 17:01:45.239166 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:45 crc kubenswrapper[4841]: I1203 17:01:45.239238 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:45 crc kubenswrapper[4841]: E1203 17:01:45.239350 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:45 crc kubenswrapper[4841]: E1203 17:01:45.239450 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:46 crc kubenswrapper[4841]: I1203 17:01:46.238294 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:46 crc kubenswrapper[4841]: E1203 17:01:46.239585 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:47 crc kubenswrapper[4841]: I1203 17:01:47.237965 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:47 crc kubenswrapper[4841]: I1203 17:01:47.237997 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:47 crc kubenswrapper[4841]: I1203 17:01:47.237978 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:47 crc kubenswrapper[4841]: E1203 17:01:47.238167 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:47 crc kubenswrapper[4841]: E1203 17:01:47.238065 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:47 crc kubenswrapper[4841]: E1203 17:01:47.238356 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:47 crc kubenswrapper[4841]: I1203 17:01:47.772059 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:47 crc kubenswrapper[4841]: E1203 17:01:47.772249 4841 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:01:47 crc kubenswrapper[4841]: E1203 17:01:47.772329 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs podName:fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99 nodeName:}" failed. No retries permitted until 2025-12-03 17:02:51.772307729 +0000 UTC m=+166.159828496 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs") pod "network-metrics-daemon-fcw2m" (UID: "fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 17:01:48 crc kubenswrapper[4841]: I1203 17:01:48.238217 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:48 crc kubenswrapper[4841]: E1203 17:01:48.238739 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:49 crc kubenswrapper[4841]: I1203 17:01:49.238609 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:49 crc kubenswrapper[4841]: I1203 17:01:49.238640 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:49 crc kubenswrapper[4841]: I1203 17:01:49.238711 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:49 crc kubenswrapper[4841]: E1203 17:01:49.238734 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:49 crc kubenswrapper[4841]: E1203 17:01:49.238882 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:49 crc kubenswrapper[4841]: E1203 17:01:49.239070 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:50 crc kubenswrapper[4841]: I1203 17:01:50.238697 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:50 crc kubenswrapper[4841]: E1203 17:01:50.239983 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:50 crc kubenswrapper[4841]: I1203 17:01:50.258820 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 17:01:51 crc kubenswrapper[4841]: I1203 17:01:51.237657 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:51 crc kubenswrapper[4841]: E1203 17:01:51.237773 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:51 crc kubenswrapper[4841]: I1203 17:01:51.238014 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:51 crc kubenswrapper[4841]: E1203 17:01:51.238153 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:51 crc kubenswrapper[4841]: I1203 17:01:51.238343 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:51 crc kubenswrapper[4841]: E1203 17:01:51.238526 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:52 crc kubenswrapper[4841]: I1203 17:01:52.238411 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:52 crc kubenswrapper[4841]: E1203 17:01:52.238519 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:52 crc kubenswrapper[4841]: I1203 17:01:52.239233 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:01:52 crc kubenswrapper[4841]: E1203 17:01:52.239398 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:01:53 crc kubenswrapper[4841]: I1203 17:01:53.238477 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:53 crc kubenswrapper[4841]: I1203 17:01:53.238573 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:53 crc kubenswrapper[4841]: I1203 17:01:53.238601 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:53 crc kubenswrapper[4841]: E1203 17:01:53.238771 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:53 crc kubenswrapper[4841]: E1203 17:01:53.238836 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:53 crc kubenswrapper[4841]: E1203 17:01:53.239022 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:54 crc kubenswrapper[4841]: I1203 17:01:54.238319 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:54 crc kubenswrapper[4841]: E1203 17:01:54.238472 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:55 crc kubenswrapper[4841]: I1203 17:01:55.238187 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:55 crc kubenswrapper[4841]: I1203 17:01:55.238220 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:55 crc kubenswrapper[4841]: E1203 17:01:55.238324 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:55 crc kubenswrapper[4841]: I1203 17:01:55.238485 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:55 crc kubenswrapper[4841]: E1203 17:01:55.238465 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:55 crc kubenswrapper[4841]: E1203 17:01:55.238680 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:56 crc kubenswrapper[4841]: I1203 17:01:56.238536 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:56 crc kubenswrapper[4841]: E1203 17:01:56.240033 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:56 crc kubenswrapper[4841]: I1203 17:01:56.275196 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=6.275154326 podStartE2EDuration="6.275154326s" podCreationTimestamp="2025-12-03 17:01:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:01:56.274575853 +0000 UTC m=+110.662096610" watchObservedRunningTime="2025-12-03 17:01:56.275154326 +0000 UTC m=+110.662675053" Dec 03 17:01:57 crc kubenswrapper[4841]: I1203 17:01:57.238766 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:57 crc kubenswrapper[4841]: I1203 17:01:57.238779 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:57 crc kubenswrapper[4841]: I1203 17:01:57.238943 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:57 crc kubenswrapper[4841]: E1203 17:01:57.239052 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:01:57 crc kubenswrapper[4841]: E1203 17:01:57.239249 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:57 crc kubenswrapper[4841]: E1203 17:01:57.239355 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:58 crc kubenswrapper[4841]: I1203 17:01:58.238322 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:01:58 crc kubenswrapper[4841]: E1203 17:01:58.238477 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:01:59 crc kubenswrapper[4841]: I1203 17:01:59.237997 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:01:59 crc kubenswrapper[4841]: I1203 17:01:59.238035 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:01:59 crc kubenswrapper[4841]: I1203 17:01:59.238002 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:01:59 crc kubenswrapper[4841]: E1203 17:01:59.238162 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:01:59 crc kubenswrapper[4841]: E1203 17:01:59.238210 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:01:59 crc kubenswrapper[4841]: E1203 17:01:59.238262 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:00 crc kubenswrapper[4841]: I1203 17:02:00.238152 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:00 crc kubenswrapper[4841]: E1203 17:02:00.238345 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:01 crc kubenswrapper[4841]: I1203 17:02:01.238775 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:01 crc kubenswrapper[4841]: I1203 17:02:01.238834 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:01 crc kubenswrapper[4841]: E1203 17:02:01.238902 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:01 crc kubenswrapper[4841]: I1203 17:02:01.238949 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:01 crc kubenswrapper[4841]: E1203 17:02:01.239055 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:01 crc kubenswrapper[4841]: E1203 17:02:01.239283 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:02 crc kubenswrapper[4841]: I1203 17:02:02.237872 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:02 crc kubenswrapper[4841]: E1203 17:02:02.238057 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:02 crc kubenswrapper[4841]: I1203 17:02:02.860197 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/1.log" Dec 03 17:02:02 crc kubenswrapper[4841]: I1203 17:02:02.861348 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/0.log" Dec 03 17:02:02 crc kubenswrapper[4841]: I1203 17:02:02.861427 4841 generic.go:334] "Generic (PLEG): container finished" podID="0752d936-15ef-4e17-8463-3185a4c1863b" containerID="f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4" exitCode=1 Dec 03 17:02:02 crc kubenswrapper[4841]: I1203 17:02:02.861476 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qwsc4" event={"ID":"0752d936-15ef-4e17-8463-3185a4c1863b","Type":"ContainerDied","Data":"f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4"} Dec 03 17:02:02 crc kubenswrapper[4841]: I1203 17:02:02.861536 4841 scope.go:117] "RemoveContainer" containerID="4af1276465dfa4745621713daa963839011a886d70c9006e983ab927e284ead7" Dec 03 17:02:02 crc kubenswrapper[4841]: 
I1203 17:02:02.862209 4841 scope.go:117] "RemoveContainer" containerID="f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4" Dec 03 17:02:02 crc kubenswrapper[4841]: E1203 17:02:02.862484 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-qwsc4_openshift-multus(0752d936-15ef-4e17-8463-3185a4c1863b)\"" pod="openshift-multus/multus-qwsc4" podUID="0752d936-15ef-4e17-8463-3185a4c1863b" Dec 03 17:02:03 crc kubenswrapper[4841]: I1203 17:02:03.238519 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:03 crc kubenswrapper[4841]: I1203 17:02:03.238625 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:03 crc kubenswrapper[4841]: E1203 17:02:03.238781 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:03 crc kubenswrapper[4841]: I1203 17:02:03.238855 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:03 crc kubenswrapper[4841]: E1203 17:02:03.238976 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:03 crc kubenswrapper[4841]: E1203 17:02:03.239078 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:03 crc kubenswrapper[4841]: I1203 17:02:03.239983 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:02:03 crc kubenswrapper[4841]: E1203 17:02:03.240225 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-d5svt_openshift-ovn-kubernetes(1853b500-b218-4412-9cbc-9fd0a76778c0)\"" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" Dec 03 17:02:03 crc kubenswrapper[4841]: I1203 17:02:03.867539 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/1.log" Dec 03 17:02:04 crc kubenswrapper[4841]: I1203 17:02:04.238431 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:04 crc kubenswrapper[4841]: E1203 17:02:04.238602 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:05 crc kubenswrapper[4841]: I1203 17:02:05.238323 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:05 crc kubenswrapper[4841]: I1203 17:02:05.238361 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:05 crc kubenswrapper[4841]: E1203 17:02:05.238460 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:05 crc kubenswrapper[4841]: I1203 17:02:05.238639 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:05 crc kubenswrapper[4841]: E1203 17:02:05.238778 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:05 crc kubenswrapper[4841]: E1203 17:02:05.238940 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:06 crc kubenswrapper[4841]: I1203 17:02:06.238055 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:06 crc kubenswrapper[4841]: E1203 17:02:06.239474 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:06 crc kubenswrapper[4841]: E1203 17:02:06.258051 4841 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 17:02:06 crc kubenswrapper[4841]: E1203 17:02:06.339747 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 17:02:07 crc kubenswrapper[4841]: I1203 17:02:07.238751 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:07 crc kubenswrapper[4841]: I1203 17:02:07.238848 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:07 crc kubenswrapper[4841]: E1203 17:02:07.238964 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:07 crc kubenswrapper[4841]: I1203 17:02:07.239026 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:07 crc kubenswrapper[4841]: E1203 17:02:07.239176 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:07 crc kubenswrapper[4841]: E1203 17:02:07.239353 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:08 crc kubenswrapper[4841]: I1203 17:02:08.238582 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:08 crc kubenswrapper[4841]: E1203 17:02:08.238777 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:09 crc kubenswrapper[4841]: I1203 17:02:09.238520 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:09 crc kubenswrapper[4841]: I1203 17:02:09.238565 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:09 crc kubenswrapper[4841]: I1203 17:02:09.238543 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:09 crc kubenswrapper[4841]: E1203 17:02:09.238677 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:09 crc kubenswrapper[4841]: E1203 17:02:09.238808 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:09 crc kubenswrapper[4841]: E1203 17:02:09.238881 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:10 crc kubenswrapper[4841]: I1203 17:02:10.239172 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:10 crc kubenswrapper[4841]: E1203 17:02:10.239298 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:11 crc kubenswrapper[4841]: I1203 17:02:11.238374 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:11 crc kubenswrapper[4841]: I1203 17:02:11.238428 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:11 crc kubenswrapper[4841]: I1203 17:02:11.238694 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:11 crc kubenswrapper[4841]: E1203 17:02:11.238846 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:11 crc kubenswrapper[4841]: E1203 17:02:11.239107 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:11 crc kubenswrapper[4841]: E1203 17:02:11.239249 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:11 crc kubenswrapper[4841]: E1203 17:02:11.341203 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 17:02:12 crc kubenswrapper[4841]: I1203 17:02:12.238147 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:12 crc kubenswrapper[4841]: E1203 17:02:12.238303 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:13 crc kubenswrapper[4841]: I1203 17:02:13.238101 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:13 crc kubenswrapper[4841]: I1203 17:02:13.238208 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:13 crc kubenswrapper[4841]: I1203 17:02:13.238169 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:13 crc kubenswrapper[4841]: E1203 17:02:13.238330 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:13 crc kubenswrapper[4841]: E1203 17:02:13.238508 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:13 crc kubenswrapper[4841]: E1203 17:02:13.238564 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:14 crc kubenswrapper[4841]: I1203 17:02:14.238895 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:14 crc kubenswrapper[4841]: E1203 17:02:14.239184 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:15 crc kubenswrapper[4841]: I1203 17:02:15.237808 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:15 crc kubenswrapper[4841]: I1203 17:02:15.237889 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:15 crc kubenswrapper[4841]: I1203 17:02:15.238211 4841 scope.go:117] "RemoveContainer" containerID="f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4" Dec 03 17:02:15 crc kubenswrapper[4841]: E1203 17:02:15.238232 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:15 crc kubenswrapper[4841]: I1203 17:02:15.238251 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:15 crc kubenswrapper[4841]: E1203 17:02:15.238340 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:15 crc kubenswrapper[4841]: E1203 17:02:15.238432 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:15 crc kubenswrapper[4841]: I1203 17:02:15.919700 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/1.log" Dec 03 17:02:15 crc kubenswrapper[4841]: I1203 17:02:15.919764 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qwsc4" event={"ID":"0752d936-15ef-4e17-8463-3185a4c1863b","Type":"ContainerStarted","Data":"60feff1c96c8347251d94b6c38b88ed2e6cc0f7947e8fdbbd1c1657433cccc05"} Dec 03 17:02:16 crc kubenswrapper[4841]: I1203 17:02:16.238163 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:16 crc kubenswrapper[4841]: E1203 17:02:16.238959 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:16 crc kubenswrapper[4841]: E1203 17:02:16.342170 4841 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 17:02:17 crc kubenswrapper[4841]: I1203 17:02:17.238191 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:17 crc kubenswrapper[4841]: E1203 17:02:17.238345 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:17 crc kubenswrapper[4841]: I1203 17:02:17.238418 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:17 crc kubenswrapper[4841]: I1203 17:02:17.238489 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:17 crc kubenswrapper[4841]: E1203 17:02:17.238931 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:17 crc kubenswrapper[4841]: E1203 17:02:17.239053 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:17 crc kubenswrapper[4841]: I1203 17:02:17.239458 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:02:17 crc kubenswrapper[4841]: I1203 17:02:17.926610 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/3.log" Dec 03 17:02:17 crc kubenswrapper[4841]: I1203 17:02:17.928838 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerStarted","Data":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} Dec 03 17:02:17 crc kubenswrapper[4841]: I1203 17:02:17.929505 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:02:17 crc kubenswrapper[4841]: 
I1203 17:02:17.968785 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podStartSLOduration=108.968756514 podStartE2EDuration="1m48.968756514s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:17.968451998 +0000 UTC m=+132.355972725" watchObservedRunningTime="2025-12-03 17:02:17.968756514 +0000 UTC m=+132.356277281" Dec 03 17:02:18 crc kubenswrapper[4841]: I1203 17:02:18.220075 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fcw2m"] Dec 03 17:02:18 crc kubenswrapper[4841]: I1203 17:02:18.220197 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:18 crc kubenswrapper[4841]: E1203 17:02:18.220305 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:18 crc kubenswrapper[4841]: I1203 17:02:18.237992 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:18 crc kubenswrapper[4841]: E1203 17:02:18.238104 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:19 crc kubenswrapper[4841]: I1203 17:02:19.238298 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:19 crc kubenswrapper[4841]: E1203 17:02:19.238421 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:19 crc kubenswrapper[4841]: I1203 17:02:19.238620 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:19 crc kubenswrapper[4841]: E1203 17:02:19.238707 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:20 crc kubenswrapper[4841]: I1203 17:02:20.238665 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:20 crc kubenswrapper[4841]: E1203 17:02:20.238790 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fcw2m" podUID="fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99" Dec 03 17:02:20 crc kubenswrapper[4841]: I1203 17:02:20.238965 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:20 crc kubenswrapper[4841]: E1203 17:02:20.239079 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 17:02:21 crc kubenswrapper[4841]: I1203 17:02:21.238713 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:21 crc kubenswrapper[4841]: I1203 17:02:21.238817 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:21 crc kubenswrapper[4841]: E1203 17:02:21.238965 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 17:02:21 crc kubenswrapper[4841]: E1203 17:02:21.239027 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.238299 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.238397 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.241579 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.242403 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.242510 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.243645 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.522770 4841 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.562984 4841 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.563420 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.564206 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7v9z5"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.564983 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.565642 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2497v"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.566452 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.567009 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-q4qff"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.567562 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.567991 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mj44m"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.568362 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.569651 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.570506 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.570776 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-67ndq"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.571437 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.572152 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.572632 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.575959 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.577320 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.577439 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.577532 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.577577 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.578040 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.578233 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.578442 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.578559 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.578780 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 17:02:22 crc 
kubenswrapper[4841]: I1203 17:02:22.578888 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.579017 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.579136 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.579738 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.579967 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580001 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580313 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580414 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580429 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580458 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580513 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"audit-1" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580573 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580584 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580624 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580587 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580678 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580756 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580781 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580810 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.580958 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.588701 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.591266 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.593753 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.594059 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.594542 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.594993 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.595385 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.611259 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.611348 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.612454 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.612760 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.613358 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.613450 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.613628 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.614230 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.619491 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kq6kf"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.620284 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.620302 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.620295 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.620821 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.621077 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.621278 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.621278 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.621501 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.621275 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.621667 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.621396 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.621947 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ngr75"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.622736 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.623287 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.624323 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.624770 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.626954 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9zpn"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.627422 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.628755 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.628809 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.630186 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.630293 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.637114 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" 
Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.638011 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8qgdw"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.638785 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.640682 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zwx96"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.641220 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jghh5"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.641552 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.641831 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642345 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642612 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2p9l\" (UniqueName: \"kubernetes.io/projected/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-kube-api-access-w2p9l\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642655 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9368a6-dc6b-42fd-9062-a01612ceb28c-serving-cert\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642681 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchxw\" (UniqueName: \"kubernetes.io/projected/bd9368a6-dc6b-42fd-9062-a01612ceb28c-kube-api-access-jchxw\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642701 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kr4g\" (UniqueName: \"kubernetes.io/projected/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-kube-api-access-4kr4g\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") 
" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642718 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgppb\" (UniqueName: \"kubernetes.io/projected/c05ce6f5-b89a-458f-ac7d-c297a18822f7-kube-api-access-sgppb\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642760 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64a47326-6d95-4fea-bf1e-ef38e231e1f3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-888hd\" (UID: \"64a47326-6d95-4fea-bf1e-ef38e231e1f3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642780 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-config\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642798 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/876346bb-a538-4b29-a71f-6ca64d8b60f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642814 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c05ce6f5-b89a-458f-ac7d-c297a18822f7-serving-cert\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642831 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-config\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-etcd-client\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642866 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642885 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c05ce6f5-b89a-458f-ac7d-c297a18822f7-service-ca-bundle\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642921 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-audit\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642942 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7jmx\" (UniqueName: \"kubernetes.io/projected/876346bb-a538-4b29-a71f-6ca64d8b60f0-kube-api-access-d7jmx\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642961 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-audit-dir\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.642983 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c05ce6f5-b89a-458f-ac7d-c297a18822f7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643004 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-node-pullsecrets\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643022 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-config\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643040 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-image-import-ca\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643057 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-encryption-config\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643076 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643093 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-images\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643121 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-etcd-client\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643143 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643162 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643181 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwwkx\" (UniqueName: \"kubernetes.io/projected/666fd1d7-2a05-4ed2-814e-4fe4c30f5e52-kube-api-access-jwwkx\") pod \"dns-operator-744455d44c-67ndq\" (UID: 
\"666fd1d7-2a05-4ed2-814e-4fe4c30f5e52\") " pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643201 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643220 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-config\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643253 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-client-ca\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643270 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-client-ca\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643288 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-machine-approver-tls\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643307 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-serving-cert\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643327 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-encryption-config\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643368 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-config\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643391 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/666fd1d7-2a05-4ed2-814e-4fe4c30f5e52-metrics-tls\") pod \"dns-operator-744455d44c-67ndq\" (UID: \"666fd1d7-2a05-4ed2-814e-4fe4c30f5e52\") " pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" Dec 03 17:02:22 crc 
kubenswrapper[4841]: I1203 17:02:22.643406 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a47326-6d95-4fea-bf1e-ef38e231e1f3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-888hd\" (UID: \"64a47326-6d95-4fea-bf1e-ef38e231e1f3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643429 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-auth-proxy-config\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643452 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ll4bh\" (UniqueName: \"kubernetes.io/projected/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-kube-api-access-ll4bh\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643474 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-serving-cert\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643489 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-audit-dir\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643506 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ff68\" (UniqueName: \"kubernetes.io/projected/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-kube-api-access-4ff68\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643532 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-audit-policies\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643550 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq8ht\" (UniqueName: \"kubernetes.io/projected/64a47326-6d95-4fea-bf1e-ef38e231e1f3-kube-api-access-mq8ht\") pod \"openshift-controller-manager-operator-756b6f6bc6-888hd\" (UID: \"64a47326-6d95-4fea-bf1e-ef38e231e1f3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643567 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05ce6f5-b89a-458f-ac7d-c297a18822f7-config\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643591 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-etcd-serving-ca\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.643940 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-k26kv"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.644495 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-k26kv" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.644896 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.645314 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.645393 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.647510 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.647756 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.648422 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.648567 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.648688 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.648805 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.648982 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.649143 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.649511 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.649776 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.649845 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.650360 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.658824 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.659644 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.660000 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.659750 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.660840 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.661482 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.661833 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.662237 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.664604 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.664740 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 17:02:22 
crc kubenswrapper[4841]: I1203 17:02:22.664891 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.665210 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.669179 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.669490 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.669634 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.669976 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.670187 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.669999 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.669497 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.678159 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.678611 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.695203 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.695573 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.696536 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.696681 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.696799 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.696942 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.697044 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.697077 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.697157 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.697247 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.697599 4841 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.697859 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mvjc8"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.698065 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.698173 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.698389 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.698493 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.699146 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.699411 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.699862 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.700009 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.700518 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mtwkx"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.700876 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.708758 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.709087 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.709361 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.709749 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.710654 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.710994 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.713023 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.714828 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.716190 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.720079 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.721406 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.722124 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.723047 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.727774 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.728157 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.729969 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.730713 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.731302 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cxw5w"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.731888 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.733120 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.733176 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2497v"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.736742 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.737275 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.737597 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.737765 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.742382 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755068 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755305 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-etcd-client\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755357 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwwkx\" (UniqueName: \"kubernetes.io/projected/666fd1d7-2a05-4ed2-814e-4fe4c30f5e52-kube-api-access-jwwkx\") pod \"dns-operator-744455d44c-67ndq\" (UID: \"666fd1d7-2a05-4ed2-814e-4fe4c30f5e52\") " pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755386 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755420 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-etcd-serving-ca\") pod 
\"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755444 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755463 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-config\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755492 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-client-ca\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755514 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-client-ca\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755531 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-machine-approver-tls\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755553 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-serving-cert\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755574 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/666fd1d7-2a05-4ed2-814e-4fe4c30f5e52-metrics-tls\") pod \"dns-operator-744455d44c-67ndq\" (UID: \"666fd1d7-2a05-4ed2-814e-4fe4c30f5e52\") " pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755596 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a47326-6d95-4fea-bf1e-ef38e231e1f3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-888hd\" (UID: \"64a47326-6d95-4fea-bf1e-ef38e231e1f3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755615 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-auth-proxy-config\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755631 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-encryption-config\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755654 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-config\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755686 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ll4bh\" (UniqueName: \"kubernetes.io/projected/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-kube-api-access-ll4bh\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755740 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-serving-cert\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755774 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-audit-dir\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc 
kubenswrapper[4841]: I1203 17:02:22.755809 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ff68\" (UniqueName: \"kubernetes.io/projected/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-kube-api-access-4ff68\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755859 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-audit-policies\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755884 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq8ht\" (UniqueName: \"kubernetes.io/projected/64a47326-6d95-4fea-bf1e-ef38e231e1f3-kube-api-access-mq8ht\") pod \"openshift-controller-manager-operator-756b6f6bc6-888hd\" (UID: \"64a47326-6d95-4fea-bf1e-ef38e231e1f3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755924 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05ce6f5-b89a-458f-ac7d-c297a18822f7-config\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755954 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-etcd-serving-ca\") pod 
\"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.755984 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0194ebac-1cfb-42eb-910e-7622443b8d15-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756014 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2p9l\" (UniqueName: \"kubernetes.io/projected/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-kube-api-access-w2p9l\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756042 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9368a6-dc6b-42fd-9062-a01612ceb28c-serving-cert\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756070 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jchxw\" (UniqueName: \"kubernetes.io/projected/bd9368a6-dc6b-42fd-9062-a01612ceb28c-kube-api-access-jchxw\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756100 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0194ebac-1cfb-42eb-910e-7622443b8d15-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756141 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kr4g\" (UniqueName: \"kubernetes.io/projected/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-kube-api-access-4kr4g\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756170 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgppb\" (UniqueName: \"kubernetes.io/projected/c05ce6f5-b89a-458f-ac7d-c297a18822f7-kube-api-access-sgppb\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64a47326-6d95-4fea-bf1e-ef38e231e1f3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-888hd\" (UID: \"64a47326-6d95-4fea-bf1e-ef38e231e1f3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756220 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-config\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756243 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/876346bb-a538-4b29-a71f-6ca64d8b60f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756270 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-config\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756288 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-etcd-client\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756312 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756328 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c05ce6f5-b89a-458f-ac7d-c297a18822f7-serving-cert\") pod 
\"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756352 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c05ce6f5-b89a-458f-ac7d-c297a18822f7-service-ca-bundle\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756375 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvjwv\" (UniqueName: \"kubernetes.io/projected/0194ebac-1cfb-42eb-910e-7622443b8d15-kube-api-access-pvjwv\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756406 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-audit\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756432 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7jmx\" (UniqueName: \"kubernetes.io/projected/876346bb-a538-4b29-a71f-6ca64d8b60f0-kube-api-access-d7jmx\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756458 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-node-pullsecrets\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-config\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756501 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-image-import-ca\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756530 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-encryption-config\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756551 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-audit-dir\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756568 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756577 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c05ce6f5-b89a-458f-ac7d-c297a18822f7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756648 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756697 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-images\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.756746 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0194ebac-1cfb-42eb-910e-7622443b8d15-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.757189 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-client-ca\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.757321 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c05ce6f5-b89a-458f-ac7d-c297a18822f7-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.757974 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-audit-policies\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.758366 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.758809 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.758838 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-config\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.758997 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-node-pullsecrets\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.759645 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-config\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.759696 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.760360 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-audit-dir\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.761332 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c05ce6f5-b89a-458f-ac7d-c297a18822f7-config\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.761638 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-images\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.761711 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/64a47326-6d95-4fea-bf1e-ef38e231e1f3-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-888hd\" (UID: \"64a47326-6d95-4fea-bf1e-ef38e231e1f3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.762050 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-audit\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.762048 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-client-ca\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.762446 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-auth-proxy-config\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.762485 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.762790 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-audit-dir\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.763394 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-config\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.763453 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-etcd-serving-ca\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.763706 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-config\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.763755 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-image-import-ca\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.764199 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-serving-cert\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.764227 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-config\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.764627 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.775517 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c05ce6f5-b89a-458f-ac7d-c297a18822f7-service-ca-bundle\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.776604 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/666fd1d7-2a05-4ed2-814e-4fe4c30f5e52-metrics-tls\") pod \"dns-operator-744455d44c-67ndq\" (UID: \"666fd1d7-2a05-4ed2-814e-4fe4c30f5e52\") " pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.776645 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/64a47326-6d95-4fea-bf1e-ef38e231e1f3-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-888hd\" (UID: \"64a47326-6d95-4fea-bf1e-ef38e231e1f3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.776831 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:22 crc 
kubenswrapper[4841]: I1203 17:02:22.776868 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-encryption-config\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.776885 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/876346bb-a538-4b29-a71f-6ca64d8b60f0-serving-cert\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.777195 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c05ce6f5-b89a-458f-ac7d-c297a18822f7-serving-cert\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.777203 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-serving-cert\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.777225 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-etcd-client\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 
crc kubenswrapper[4841]: I1203 17:02:22.777237 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.777388 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-encryption-config\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.777615 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7v9z5"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.777673 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-machine-approver-tls\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.779129 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.780383 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-etcd-client\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.781522 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p6dg7"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.781601 4841 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.782014 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tmcv7"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.782180 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.782846 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.782928 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.783456 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-q4qff"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.785010 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.785398 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9368a6-dc6b-42fd-9062-a01612ceb28c-serving-cert\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.787112 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.792235 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-mj44m"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.793656 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.795121 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ngr75"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.796964 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mvjc8"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.797554 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.798752 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8qgdw"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.799793 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.800820 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.801921 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9zpn"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.803044 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kq6kf"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.804034 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zwx96"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 
17:02:22.805051 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.805831 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mtwkx"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.806664 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.807688 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.808703 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-67ndq"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.809720 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.810710 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-k26kv"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.811693 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-klz55"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.812241 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-klz55" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.813179 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-25w5k"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.814091 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.814230 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.815417 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.816428 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.817577 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-klz55"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.818773 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.820041 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.821587 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.822242 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.823303 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tmcv7"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.824468 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.825789 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.826058 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.826884 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.827978 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.829159 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p6dg7"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.830164 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-25w5k"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.830937 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cxw5w"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.831845 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vfkgp"] Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.832342 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.845178 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.857647 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0194ebac-1cfb-42eb-910e-7622443b8d15-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.857706 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0194ebac-1cfb-42eb-910e-7622443b8d15-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.857757 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvjwv\" (UniqueName: \"kubernetes.io/projected/0194ebac-1cfb-42eb-910e-7622443b8d15-kube-api-access-pvjwv\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.857818 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0194ebac-1cfb-42eb-910e-7622443b8d15-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: 
\"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.859705 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0194ebac-1cfb-42eb-910e-7622443b8d15-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.861237 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/0194ebac-1cfb-42eb-910e-7622443b8d15-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.865559 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.885989 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.905988 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.945362 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 17:02:22 crc kubenswrapper[4841]: I1203 17:02:22.965595 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 17:02:22 crc kubenswrapper[4841]: 
I1203 17:02:22.985520 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.005319 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.025399 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.045327 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.066320 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.084954 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.105105 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.125973 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.146192 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.165413 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.185614 4841 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.213688 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.226336 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.238620 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.238637 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.245328 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.265676 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.285181 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.304828 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.324836 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.346269 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.364741 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.385749 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.405531 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.425628 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.445993 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.467216 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.487024 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.506511 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.526717 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.545568 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.566368 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.585469 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.605577 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.626097 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.646109 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.667721 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.685968 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.703666 4841 request.go:700] Waited for 1.00253805s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/secrets?fieldSelector=metadata.name%3Dimage-registry-tls&limit=500&resourceVersion=0 Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.705632 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"image-registry-tls" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.725678 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.746693 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.766118 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.785933 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.806342 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.826259 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.846372 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.886500 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.906469 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.926485 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.946057 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.966136 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 17:02:23 crc kubenswrapper[4841]: I1203 17:02:23.985987 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.006155 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.026361 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.045663 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.066028 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.090049 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.106703 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.125681 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.146628 4841 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.166260 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.186154 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.221480 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwwkx\" (UniqueName: \"kubernetes.io/projected/666fd1d7-2a05-4ed2-814e-4fe4c30f5e52-kube-api-access-jwwkx\") pod \"dns-operator-744455d44c-67ndq\" (UID: \"666fd1d7-2a05-4ed2-814e-4fe4c30f5e52\") " pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.228252 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.239838 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ff68\" (UniqueName: \"kubernetes.io/projected/9bf1c45d-ffb9-423e-bdea-7e2d209a47d1-kube-api-access-4ff68\") pod \"machine-api-operator-5694c8668f-q4qff\" (UID: \"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.248549 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.280392 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7jmx\" (UniqueName: \"kubernetes.io/projected/876346bb-a538-4b29-a71f-6ca64d8b60f0-kube-api-access-d7jmx\") pod \"route-controller-manager-6576b87f9c-59tlz\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.299686 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchxw\" (UniqueName: \"kubernetes.io/projected/bd9368a6-dc6b-42fd-9062-a01612ceb28c-kube-api-access-jchxw\") pod \"controller-manager-879f6c89f-mj44m\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.319137 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kr4g\" (UniqueName: \"kubernetes.io/projected/671b18a0-95da-4c17-9ef5-4b0dc243ff4f-kube-api-access-4kr4g\") pod \"apiserver-76f77b778f-7v9z5\" (UID: \"671b18a0-95da-4c17-9ef5-4b0dc243ff4f\") " pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 
17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.339404 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ll4bh\" (UniqueName: \"kubernetes.io/projected/c19b6ad2-27ad-4ad0-91ed-d44eb02011ac-kube-api-access-ll4bh\") pod \"machine-approver-56656f9798-5qpx2\" (UID: \"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.362546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgppb\" (UniqueName: \"kubernetes.io/projected/c05ce6f5-b89a-458f-ac7d-c297a18822f7-kube-api-access-sgppb\") pod \"authentication-operator-69f744f599-2497v\" (UID: \"c05ce6f5-b89a-458f-ac7d-c297a18822f7\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.378559 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2p9l\" (UniqueName: \"kubernetes.io/projected/b38ec4ff-67eb-467c-97d3-efbc96c8b4d7-kube-api-access-w2p9l\") pod \"apiserver-7bbb656c7d-q28gm\" (UID: \"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.389819 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.400350 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq8ht\" (UniqueName: \"kubernetes.io/projected/64a47326-6d95-4fea-bf1e-ef38e231e1f3-kube-api-access-mq8ht\") pod \"openshift-controller-manager-operator-756b6f6bc6-888hd\" (UID: \"64a47326-6d95-4fea-bf1e-ef38e231e1f3\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.408392 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.416093 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.425452 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.429705 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.442791 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.443566 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-67ndq"] Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.450695 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.465528 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.486118 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.496751 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.505285 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.508123 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.526311 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.542334 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.546172 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.557586 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.568363 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.578918 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz"] Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.585979 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.606211 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.606593 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7v9z5"] Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.627603 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 17:02:24 crc kubenswrapper[4841]: W1203 17:02:24.635025 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod671b18a0_95da_4c17_9ef5_4b0dc243ff4f.slice/crio-1f1874791ed864621610451568435ab501ee86126d5857495b820681f8470efe WatchSource:0}: Error finding container 1f1874791ed864621610451568435ab501ee86126d5857495b820681f8470efe: 
Status 404 returned error can't find the container with id 1f1874791ed864621610451568435ab501ee86126d5857495b820681f8470efe Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.645505 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.666009 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.687834 4841 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.704730 4841 request.go:700] Waited for 1.890451282s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/hostpath-provisioner/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.706590 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.727984 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.745564 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mj44m"] Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.746019 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 17:02:24 crc kubenswrapper[4841]: W1203 17:02:24.758239 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd9368a6_dc6b_42fd_9062_a01612ceb28c.slice/crio-11de0bd5c737d7a708f0367cab344c6d108fd155bc7c8f8bafc96be0c4f3dabf WatchSource:0}: Error finding container 11de0bd5c737d7a708f0367cab344c6d108fd155bc7c8f8bafc96be0c4f3dabf: Status 404 returned error can't find the container with id 11de0bd5c737d7a708f0367cab344c6d108fd155bc7c8f8bafc96be0c4f3dabf Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.768168 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.778044 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm"] Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.789385 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.819757 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0194ebac-1cfb-42eb-910e-7622443b8d15-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.826956 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd"] Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.845499 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvjwv\" (UniqueName: \"kubernetes.io/projected/0194ebac-1cfb-42eb-910e-7622443b8d15-kube-api-access-pvjwv\") pod \"cluster-image-registry-operator-dc59b4c8b-dkcrc\" (UID: \"0194ebac-1cfb-42eb-910e-7622443b8d15\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.852563 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-2497v"] Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.862889 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-q4qff"] Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.865997 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.885392 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.895599 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.957330 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" event={"ID":"671b18a0-95da-4c17-9ef5-4b0dc243ff4f","Type":"ContainerStarted","Data":"1f1874791ed864621610451568435ab501ee86126d5857495b820681f8470efe"} Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.958311 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" event={"ID":"876346bb-a538-4b29-a71f-6ca64d8b60f0","Type":"ContainerStarted","Data":"811ad334cfe9d7f26f06b65278eec7890241fc0cec2d95c00ae66cc23298e260"} Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.959389 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" 
event={"ID":"666fd1d7-2a05-4ed2-814e-4fe4c30f5e52","Type":"ContainerStarted","Data":"686efa49462e56344a80665987685bba513e6a198dc0233cc573b11e4b35049f"} Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.960397 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" event={"ID":"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac","Type":"ContainerStarted","Data":"a4cdbbff1e41722a3532b737afed94f8eaec71ad9e5d0281c666b423cc5b5e90"} Dec 03 17:02:24 crc kubenswrapper[4841]: I1203 17:02:24.961658 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" event={"ID":"bd9368a6-dc6b-42fd-9062-a01612ceb28c","Type":"ContainerStarted","Data":"11de0bd5c737d7a708f0367cab344c6d108fd155bc7c8f8bafc96be0c4f3dabf"} Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.453677 4841 request.go:700] Waited for 1.141002366s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-console-operator/pods/console-operator-58897d9998-kq6kf Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.458960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-trusted-ca\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.459017 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bjf8\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-kube-api-access-6bjf8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" 
Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.459054 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c61a910-e9a4-4f77-a5d4-56e760ed1394-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.459080 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-certificates\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.459536 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: E1203 17:02:26.460483 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:26.960470752 +0000 UTC m=+141.347991479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.460887 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-tls\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.461013 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-bound-sa-token\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.461124 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c61a910-e9a4-4f77-a5d4-56e760ed1394-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562014 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562191 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4261fd00-56f9-4b61-8f68-23092f89b47f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bk6pj\" (UID: \"4261fd00-56f9-4b61-8f68-23092f89b47f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562238 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-dir\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562264 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562353 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9fdbc19c-851c-4465-ac8f-e86caca57814-etcd-client\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562384 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-policies\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562413 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6s6s\" (UniqueName: \"kubernetes.io/projected/b95878a7-fa1d-4d27-a660-a80646f1c8db-kube-api-access-c6s6s\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562443 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9fdbc19c-851c-4465-ac8f-e86caca57814-etcd-ca\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562486 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-oauth-serving-cert\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562546 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4261fd00-56f9-4b61-8f68-23092f89b47f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bk6pj\" (UID: 
\"4261fd00-56f9-4b61-8f68-23092f89b47f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562578 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562684 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a95f13d-5349-4097-8482-88efaa760147-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rx2p2\" (UID: \"1a95f13d-5349-4097-8482-88efaa760147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562714 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fdbc19c-851c-4465-ac8f-e86caca57814-config\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562753 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-oauth-config\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562785 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/59aa053f-a2a4-4af0-ae8f-bf2e65ee406c-srv-cert\") pod \"catalog-operator-68c6474976-mt7kx\" (UID: \"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562815 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt4wf\" (UniqueName: \"kubernetes.io/projected/59aa053f-a2a4-4af0-ae8f-bf2e65ee406c-kube-api-access-kt4wf\") pod \"catalog-operator-68c6474976-mt7kx\" (UID: \"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562847 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc498551-0214-4646-bebe-1129a989142c-metrics-certs\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562869 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4f720c5-d60c-449f-8c98-8295bd17c472-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-66nbw\" (UID: \"b4f720c5-d60c-449f-8c98-8295bd17c472\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562928 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9fdbc19c-851c-4465-ac8f-e86caca57814-etcd-service-ca\") pod \"etcd-operator-b45778765-mvjc8\" (UID: 
\"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.562971 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34704048-bc3f-4744-a1ec-2fc6176c9e61-trusted-ca\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563002 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92178376-5eeb-4fdd-948e-b6c7beaa25e1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xtkh4\" (UID: \"92178376-5eeb-4fdd-948e-b6c7beaa25e1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563034 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-trusted-ca-bundle\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563063 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5q25\" (UniqueName: \"kubernetes.io/projected/02cf73a4-150b-4a14-9e46-6e986b38304f-kube-api-access-c5q25\") pod \"openshift-config-operator-7777fb866f-zwx96\" (UID: \"02cf73a4-150b-4a14-9e46-6e986b38304f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563106 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4f720c5-d60c-449f-8c98-8295bd17c472-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-66nbw\" (UID: \"b4f720c5-d60c-449f-8c98-8295bd17c472\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563135 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b95878a7-fa1d-4d27-a660-a80646f1c8db-webhook-cert\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563166 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563194 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrgq\" (UniqueName: \"kubernetes.io/projected/34704048-bc3f-4744-a1ec-2fc6176c9e61-kube-api-access-ngrgq\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563227 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-bound-sa-token\") pod 
\"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563258 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqx45\" (UniqueName: \"kubernetes.io/projected/9b021c9d-ee1d-4b78-bb6f-08303c69ebee-kube-api-access-cqx45\") pod \"multus-admission-controller-857f4d67dd-8qgdw\" (UID: \"9b021c9d-ee1d-4b78-bb6f-08303c69ebee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563290 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4f720c5-d60c-449f-8c98-8295bd17c472-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-66nbw\" (UID: \"b4f720c5-d60c-449f-8c98-8295bd17c472\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563340 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b021c9d-ee1d-4b78-bb6f-08303c69ebee-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8qgdw\" (UID: \"9b021c9d-ee1d-4b78-bb6f-08303c69ebee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563359 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34704048-bc3f-4744-a1ec-2fc6176c9e61-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc 
kubenswrapper[4841]: I1203 17:02:26.563380 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a25fb107-dace-4acc-8cac-e73b2788db2f-trusted-ca\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563400 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92178376-5eeb-4fdd-948e-b6c7beaa25e1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xtkh4\" (UID: \"92178376-5eeb-4fdd-948e-b6c7beaa25e1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563419 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cc498551-0214-4646-bebe-1129a989142c-stats-auth\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563471 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c61a910-e9a4-4f77-a5d4-56e760ed1394-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563505 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b966f8b3-646c-4a50-a979-a65a815947e8-config\") pod 
\"kube-apiserver-operator-766d6c64bb-qntpv\" (UID: \"b966f8b3-646c-4a50-a979-a65a815947e8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563526 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563558 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25fb107-dace-4acc-8cac-e73b2788db2f-serving-cert\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563595 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a95f13d-5349-4097-8482-88efaa760147-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rx2p2\" (UID: \"1a95f13d-5349-4097-8482-88efaa760147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563628 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cc498551-0214-4646-bebe-1129a989142c-default-certificate\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc 
kubenswrapper[4841]: I1203 17:02:26.563658 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b95878a7-fa1d-4d27-a660-a80646f1c8db-tmpfs\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563695 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfwsn\" (UniqueName: \"kubernetes.io/projected/cc498551-0214-4646-bebe-1129a989142c-kube-api-access-bfwsn\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563717 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfk4n\" (UniqueName: \"kubernetes.io/projected/ec55af94-b794-47c8-b6bd-ce686ad2f9a4-kube-api-access-nfk4n\") pod \"cluster-samples-operator-665b6dd947-2jhj8\" (UID: \"ec55af94-b794-47c8-b6bd-ce686ad2f9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-trusted-ca\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563779 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bjf8\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-kube-api-access-6bjf8\") pod 
\"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563813 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563894 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b966f8b3-646c-4a50-a979-a65a815947e8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qntpv\" (UID: \"b966f8b3-646c-4a50-a979-a65a815947e8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.563985 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25fb107-dace-4acc-8cac-e73b2788db2f-config\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc 
kubenswrapper[4841]: I1203 17:02:26.564021 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02cf73a4-150b-4a14-9e46-6e986b38304f-serving-cert\") pod \"openshift-config-operator-7777fb866f-zwx96\" (UID: \"02cf73a4-150b-4a14-9e46-6e986b38304f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.564055 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.564086 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c61a910-e9a4-4f77-a5d4-56e760ed1394-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.564109 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a95f13d-5349-4097-8482-88efaa760147-config\") pod \"kube-controller-manager-operator-78b949d7b-rx2p2\" (UID: \"1a95f13d-5349-4097-8482-88efaa760147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.564178 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-certificates\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.564210 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtz45\" (UniqueName: \"kubernetes.io/projected/92178376-5eeb-4fdd-948e-b6c7beaa25e1-kube-api-access-rtz45\") pod \"kube-storage-version-migrator-operator-b67b599dd-xtkh4\" (UID: \"92178376-5eeb-4fdd-948e-b6c7beaa25e1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.564262 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8qxn\" (UniqueName: \"kubernetes.io/projected/34e90356-ed2e-4e60-9e00-97a1b62d640b-kube-api-access-t8qxn\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.564295 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: E1203 17:02:26.564834 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 17:02:27.064594237 +0000 UTC m=+141.452114994 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.566996 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b95878a7-fa1d-4d27-a660-a80646f1c8db-apiservice-cert\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.567196 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34704048-bc3f-4744-a1ec-2fc6176c9e61-metrics-tls\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.567439 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec55af94-b794-47c8-b6bd-ce686ad2f9a4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2jhj8\" (UID: \"ec55af94-b794-47c8-b6bd-ce686ad2f9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.567780 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-tls\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.568583 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c61a910-e9a4-4f77-a5d4-56e760ed1394-ca-trust-extracted\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570101 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-certificates\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570379 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-trusted-ca\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570461 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cvw8\" (UniqueName: \"kubernetes.io/projected/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-kube-api-access-5cvw8\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc 
kubenswrapper[4841]: I1203 17:02:26.570528 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnrdb\" (UniqueName: \"kubernetes.io/projected/ecf0dd5d-5164-4c5f-a4b5-e394182adc25-kube-api-access-mnrdb\") pod \"downloads-7954f5f757-k26kv\" (UID: \"ecf0dd5d-5164-4c5f-a4b5-e394182adc25\") " pod="openshift-console/downloads-7954f5f757-k26kv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570556 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gnkf\" (UniqueName: \"kubernetes.io/projected/f762a14e-29c3-4f5d-b051-60431de11a82-kube-api-access-6gnkf\") pod \"migrator-59844c95c7-7dnfb\" (UID: \"f762a14e-29c3-4f5d-b051-60431de11a82\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj9kj\" (UniqueName: \"kubernetes.io/projected/1f113f14-eee2-482f-9142-feac7fb442de-kube-api-access-xj9kj\") pod \"machine-config-controller-84d6567774-j79s8\" (UID: \"1f113f14-eee2-482f-9142-feac7fb442de\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570642 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f113f14-eee2-482f-9142-feac7fb442de-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-j79s8\" (UID: \"1f113f14-eee2-482f-9142-feac7fb442de\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570666 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-config\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570694 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlfh8\" (UniqueName: \"kubernetes.io/projected/4261fd00-56f9-4b61-8f68-23092f89b47f-kube-api-access-nlfh8\") pod \"openshift-apiserver-operator-796bbdcf4f-bk6pj\" (UID: \"4261fd00-56f9-4b61-8f68-23092f89b47f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/59aa053f-a2a4-4af0-ae8f-bf2e65ee406c-profile-collector-cert\") pod \"catalog-operator-68c6474976-mt7kx\" (UID: \"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.570783 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.571695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c61a910-e9a4-4f77-a5d4-56e760ed1394-installation-pull-secrets\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.577258 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-tls\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578015 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578079 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvfbc\" (UniqueName: \"kubernetes.io/projected/a25fb107-dace-4acc-8cac-e73b2788db2f-kube-api-access-cvfbc\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578114 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f113f14-eee2-482f-9142-feac7fb442de-proxy-tls\") pod \"machine-config-controller-84d6567774-j79s8\" (UID: \"1f113f14-eee2-482f-9142-feac7fb442de\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578197 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/b966f8b3-646c-4a50-a979-a65a815947e8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qntpv\" (UID: \"b966f8b3-646c-4a50-a979-a65a815947e8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578234 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzprg\" (UniqueName: \"kubernetes.io/projected/9fdbc19c-851c-4465-ac8f-e86caca57814-kube-api-access-dzprg\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578267 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/02cf73a4-150b-4a14-9e46-6e986b38304f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zwx96\" (UID: \"02cf73a4-150b-4a14-9e46-6e986b38304f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578320 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fdbc19c-851c-4465-ac8f-e86caca57814-serving-cert\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578351 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578382 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc498551-0214-4646-bebe-1129a989142c-service-ca-bundle\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578463 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-serving-cert\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.578495 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-service-ca\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.586894 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-bound-sa-token\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.591806 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bjf8\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-kube-api-access-6bjf8\") pod 
\"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.681285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqx45\" (UniqueName: \"kubernetes.io/projected/9b021c9d-ee1d-4b78-bb6f-08303c69ebee-kube-api-access-cqx45\") pod \"multus-admission-controller-857f4d67dd-8qgdw\" (UID: \"9b021c9d-ee1d-4b78-bb6f-08303c69ebee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.681352 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a1b67a-3b53-4c3d-b352-9647a312efbb-cert\") pod \"ingress-canary-klz55\" (UID: \"59a1b67a-3b53-4c3d-b352-9647a312efbb\") " pod="openshift-ingress-canary/ingress-canary-klz55" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.681399 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p6dg7\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.681492 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4f720c5-d60c-449f-8c98-8295bd17c472-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-66nbw\" (UID: \"b4f720c5-d60c-449f-8c98-8295bd17c472\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.681519 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52ead505-bb1a-4256-8986-20eea1433dac-signing-cabundle\") pod \"service-ca-9c57cc56f-cxw5w\" (UID: \"52ead505-bb1a-4256-8986-20eea1433dac\") " pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.681832 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z658g\" (UniqueName: \"kubernetes.io/projected/ad231084-6053-40b1-892c-284992b5df93-kube-api-access-z658g\") pod \"collect-profiles-29413020-pdv9l\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.681944 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b021c9d-ee1d-4b78-bb6f-08303c69ebee-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8qgdw\" (UID: \"9b021c9d-ee1d-4b78-bb6f-08303c69ebee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.681969 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34704048-bc3f-4744-a1ec-2fc6176c9e61-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.682540 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a25fb107-dace-4acc-8cac-e73b2788db2f-trusted-ca\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " 
pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.683879 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a25fb107-dace-4acc-8cac-e73b2788db2f-trusted-ca\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.682571 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/292c8d77-02e4-4a9a-8f4c-3cd1acd07907-profile-collector-cert\") pod \"olm-operator-6b444d44fb-44m78\" (UID: \"292c8d77-02e4-4a9a-8f4c-3cd1acd07907\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.684033 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5hrm\" (UniqueName: \"kubernetes.io/projected/292c8d77-02e4-4a9a-8f4c-3cd1acd07907-kube-api-access-r5hrm\") pod \"olm-operator-6b444d44fb-44m78\" (UID: \"292c8d77-02e4-4a9a-8f4c-3cd1acd07907\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.684083 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92178376-5eeb-4fdd-948e-b6c7beaa25e1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xtkh4\" (UID: \"92178376-5eeb-4fdd-948e-b6c7beaa25e1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.684110 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/cc498551-0214-4646-bebe-1129a989142c-stats-auth\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.684136 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b966f8b3-646c-4a50-a979-a65a815947e8-config\") pod \"kube-apiserver-operator-766d6c64bb-qntpv\" (UID: \"b966f8b3-646c-4a50-a979-a65a815947e8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.684836 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/92178376-5eeb-4fdd-948e-b6c7beaa25e1-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-xtkh4\" (UID: \"92178376-5eeb-4fdd-948e-b6c7beaa25e1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.684161 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.685158 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac141b85-7778-4866-8dee-ee96cbc5f6b7-serving-cert\") pod \"service-ca-operator-777779d784-tmx4h\" (UID: \"ac141b85-7778-4866-8dee-ee96cbc5f6b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 
17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.685185 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25fb107-dace-4acc-8cac-e73b2788db2f-serving-cert\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687173 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad231084-6053-40b1-892c-284992b5df93-config-volume\") pod \"collect-profiles-29413020-pdv9l\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.685307 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b966f8b3-646c-4a50-a979-a65a815947e8-config\") pod \"kube-apiserver-operator-766d6c64bb-qntpv\" (UID: \"b966f8b3-646c-4a50-a979-a65a815947e8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687278 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a95f13d-5349-4097-8482-88efaa760147-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rx2p2\" (UID: \"1a95f13d-5349-4097-8482-88efaa760147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687534 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b95878a7-fa1d-4d27-a660-a80646f1c8db-tmpfs\") pod \"packageserver-d55dfcdfc-k8hdg\" 
(UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cc498551-0214-4646-bebe-1129a989142c-default-certificate\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687607 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfwsn\" (UniqueName: \"kubernetes.io/projected/cc498551-0214-4646-bebe-1129a989142c-kube-api-access-bfwsn\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687633 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687656 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfk4n\" (UniqueName: \"kubernetes.io/projected/ec55af94-b794-47c8-b6bd-ce686ad2f9a4-kube-api-access-nfk4n\") pod \"cluster-samples-operator-665b6dd947-2jhj8\" (UID: \"ec55af94-b794-47c8-b6bd-ce686ad2f9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687681 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/292c8d77-02e4-4a9a-8f4c-3cd1acd07907-srv-cert\") pod \"olm-operator-6b444d44fb-44m78\" (UID: \"292c8d77-02e4-4a9a-8f4c-3cd1acd07907\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687724 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687750 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b966f8b3-646c-4a50-a979-a65a815947e8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qntpv\" (UID: \"b966f8b3-646c-4a50-a979-a65a815947e8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687775 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mp8nm\" (UniqueName: \"kubernetes.io/projected/a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4-kube-api-access-mp8nm\") pod \"package-server-manager-789f6589d5-zj64h\" (UID: \"a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687801 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25fb107-dace-4acc-8cac-e73b2788db2f-config\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " 
pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687824 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02cf73a4-150b-4a14-9e46-6e986b38304f-serving-cert\") pod \"openshift-config-operator-7777fb866f-zwx96\" (UID: \"02cf73a4-150b-4a14-9e46-6e986b38304f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687860 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687886 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a95f13d-5349-4097-8482-88efaa760147-config\") pod \"kube-controller-manager-operator-78b949d7b-rx2p2\" (UID: \"1a95f13d-5349-4097-8482-88efaa760147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687961 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtz45\" (UniqueName: \"kubernetes.io/projected/92178376-5eeb-4fdd-948e-b6c7beaa25e1-kube-api-access-rtz45\") pod \"kube-storage-version-migrator-operator-b67b599dd-xtkh4\" (UID: \"92178376-5eeb-4fdd-948e-b6c7beaa25e1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687988 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-t8qxn\" (UniqueName: \"kubernetes.io/projected/34e90356-ed2e-4e60-9e00-97a1b62d640b-kube-api-access-t8qxn\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqbkw\" (UniqueName: \"kubernetes.io/projected/59a1b67a-3b53-4c3d-b352-9647a312efbb-kube-api-access-lqbkw\") pod \"ingress-canary-klz55\" (UID: \"59a1b67a-3b53-4c3d-b352-9647a312efbb\") " pod="openshift-ingress-canary/ingress-canary-klz55" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688040 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688065 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-csi-data-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688091 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b95878a7-fa1d-4d27-a660-a80646f1c8db-apiservice-cert\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: 
I1203 17:02:26.688116 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9323acaa-2ebf-4776-824e-8e532a9f3b62-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688146 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34704048-bc3f-4744-a1ec-2fc6176c9e61-metrics-tls\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688169 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec55af94-b794-47c8-b6bd-ce686ad2f9a4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2jhj8\" (UID: \"ec55af94-b794-47c8-b6bd-ce686ad2f9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688191 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad231084-6053-40b1-892c-284992b5df93-secret-volume\") pod \"collect-profiles-29413020-pdv9l\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688221 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cvw8\" (UniqueName: \"kubernetes.io/projected/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-kube-api-access-5cvw8\") pod 
\"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688247 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-mountpoint-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688277 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnrdb\" (UniqueName: \"kubernetes.io/projected/ecf0dd5d-5164-4c5f-a4b5-e394182adc25-kube-api-access-mnrdb\") pod \"downloads-7954f5f757-k26kv\" (UID: \"ecf0dd5d-5164-4c5f-a4b5-e394182adc25\") " pod="openshift-console/downloads-7954f5f757-k26kv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688314 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gnkf\" (UniqueName: \"kubernetes.io/projected/f762a14e-29c3-4f5d-b051-60431de11a82-kube-api-access-6gnkf\") pod \"migrator-59844c95c7-7dnfb\" (UID: \"f762a14e-29c3-4f5d-b051-60431de11a82\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688344 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj9kj\" (UniqueName: \"kubernetes.io/projected/1f113f14-eee2-482f-9142-feac7fb442de-kube-api-access-xj9kj\") pod \"machine-config-controller-84d6567774-j79s8\" (UID: \"1f113f14-eee2-482f-9142-feac7fb442de\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688372 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f113f14-eee2-482f-9142-feac7fb442de-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-j79s8\" (UID: \"1f113f14-eee2-482f-9142-feac7fb442de\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688398 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-config\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688424 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlfh8\" (UniqueName: \"kubernetes.io/projected/4261fd00-56f9-4b61-8f68-23092f89b47f-kube-api-access-nlfh8\") pod \"openshift-apiserver-operator-796bbdcf4f-bk6pj\" (UID: \"4261fd00-56f9-4b61-8f68-23092f89b47f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688447 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk99j\" (UniqueName: \"kubernetes.io/projected/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-kube-api-access-bk99j\") pod \"marketplace-operator-79b997595-p6dg7\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688473 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtw9k\" (UniqueName: \"kubernetes.io/projected/443df56d-afb1-4463-a11f-4de38331f234-kube-api-access-vtw9k\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " 
pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688500 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/59aa053f-a2a4-4af0-ae8f-bf2e65ee406c-profile-collector-cert\") pod \"catalog-operator-68c6474976-mt7kx\" (UID: \"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688524 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688547 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9323acaa-2ebf-4776-824e-8e532a9f3b62-proxy-tls\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688574 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688598 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cvfbc\" (UniqueName: \"kubernetes.io/projected/a25fb107-dace-4acc-8cac-e73b2788db2f-kube-api-access-cvfbc\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688623 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfpqg\" (UniqueName: \"kubernetes.io/projected/0a53af00-4d81-4217-a154-052396aac6dd-kube-api-access-zfpqg\") pod \"machine-config-server-vfkgp\" (UID: \"0a53af00-4d81-4217-a154-052396aac6dd\") " pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688649 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f113f14-eee2-482f-9142-feac7fb442de-proxy-tls\") pod \"machine-config-controller-84d6567774-j79s8\" (UID: \"1f113f14-eee2-482f-9142-feac7fb442de\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688673 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-plugins-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688697 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c07ec09-68a5-4c56-a97a-5eb0a73a020d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng62w\" (UID: \"4c07ec09-68a5-4c56-a97a-5eb0a73a020d\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688736 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b966f8b3-646c-4a50-a979-a65a815947e8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qntpv\" (UID: \"b966f8b3-646c-4a50-a979-a65a815947e8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688759 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzprg\" (UniqueName: \"kubernetes.io/projected/9fdbc19c-851c-4465-ac8f-e86caca57814-kube-api-access-dzprg\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688783 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/02cf73a4-150b-4a14-9e46-6e986b38304f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zwx96\" (UID: \"02cf73a4-150b-4a14-9e46-6e986b38304f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688806 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-registration-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688827 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/c177bc13-030a-442f-9efa-603b4620a2c8-metrics-tls\") pod \"dns-default-tmcv7\" (UID: \"c177bc13-030a-442f-9efa-603b4620a2c8\") " pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688851 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fdbc19c-851c-4465-ac8f-e86caca57814-serving-cert\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688874 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688896 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc498551-0214-4646-bebe-1129a989142c-service-ca-bundle\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688944 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-serving-cert\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688967 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" 
(UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-service-ca\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.688990 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52ead505-bb1a-4256-8986-20eea1433dac-signing-key\") pod \"service-ca-9c57cc56f-cxw5w\" (UID: \"52ead505-bb1a-4256-8986-20eea1433dac\") " pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689021 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4261fd00-56f9-4b61-8f68-23092f89b47f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bk6pj\" (UID: \"4261fd00-56f9-4b61-8f68-23092f89b47f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689046 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6q5m\" (UniqueName: \"kubernetes.io/projected/ac141b85-7778-4866-8dee-ee96cbc5f6b7-kube-api-access-k6q5m\") pod \"service-ca-operator-777779d784-tmx4h\" (UID: \"ac141b85-7778-4866-8dee-ee96cbc5f6b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689071 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-dir\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689094 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689119 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmz29\" (UniqueName: \"kubernetes.io/projected/4c07ec09-68a5-4c56-a97a-5eb0a73a020d-kube-api-access-wmz29\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng62w\" (UID: \"4c07ec09-68a5-4c56-a97a-5eb0a73a020d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689143 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qx2k\" (UniqueName: \"kubernetes.io/projected/c177bc13-030a-442f-9efa-603b4620a2c8-kube-api-access-2qx2k\") pod \"dns-default-tmcv7\" (UID: \"c177bc13-030a-442f-9efa-603b4620a2c8\") " pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689165 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-policies\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9fdbc19c-851c-4465-ac8f-e86caca57814-etcd-client\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689216 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6s6s\" (UniqueName: \"kubernetes.io/projected/b95878a7-fa1d-4d27-a660-a80646f1c8db-kube-api-access-c6s6s\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689259 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9fdbc19c-851c-4465-ac8f-e86caca57814-etcd-ca\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-oauth-serving-cert\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689308 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4261fd00-56f9-4b61-8f68-23092f89b47f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bk6pj\" (UID: \"4261fd00-56f9-4b61-8f68-23092f89b47f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689333 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689356 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac141b85-7778-4866-8dee-ee96cbc5f6b7-config\") pod \"service-ca-operator-777779d784-tmx4h\" (UID: \"ac141b85-7778-4866-8dee-ee96cbc5f6b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689379 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c177bc13-030a-442f-9efa-603b4620a2c8-config-volume\") pod \"dns-default-tmcv7\" (UID: \"c177bc13-030a-442f-9efa-603b4620a2c8\") " pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689401 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94plm\" (UniqueName: \"kubernetes.io/projected/9323acaa-2ebf-4776-824e-8e532a9f3b62-kube-api-access-94plm\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689423 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-socket-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 
17:02:26.689445 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9323acaa-2ebf-4776-824e-8e532a9f3b62-images\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689486 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689510 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a95f13d-5349-4097-8482-88efaa760147-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rx2p2\" (UID: \"1a95f13d-5349-4097-8482-88efaa760147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689532 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fdbc19c-851c-4465-ac8f-e86caca57814-config\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689556 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zj64h\" (UID: \"a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689583 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-oauth-config\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689603 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/59aa053f-a2a4-4af0-ae8f-bf2e65ee406c-srv-cert\") pod \"catalog-operator-68c6474976-mt7kx\" (UID: \"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689624 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0a53af00-4d81-4217-a154-052396aac6dd-node-bootstrap-token\") pod \"machine-config-server-vfkgp\" (UID: \"0a53af00-4d81-4217-a154-052396aac6dd\") " pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689647 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc498551-0214-4646-bebe-1129a989142c-metrics-certs\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689672 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt4wf\" (UniqueName: \"kubernetes.io/projected/59aa053f-a2a4-4af0-ae8f-bf2e65ee406c-kube-api-access-kt4wf\") pod \"catalog-operator-68c6474976-mt7kx\" (UID: \"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4f720c5-d60c-449f-8c98-8295bd17c472-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-66nbw\" (UID: \"b4f720c5-d60c-449f-8c98-8295bd17c472\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689720 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92178376-5eeb-4fdd-948e-b6c7beaa25e1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xtkh4\" (UID: \"92178376-5eeb-4fdd-948e-b6c7beaa25e1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689740 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0a53af00-4d81-4217-a154-052396aac6dd-certs\") pod \"machine-config-server-vfkgp\" (UID: \"0a53af00-4d81-4217-a154-052396aac6dd\") " pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689764 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9fdbc19c-851c-4465-ac8f-e86caca57814-etcd-service-ca\") pod \"etcd-operator-b45778765-mvjc8\" (UID: 
\"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689786 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/34704048-bc3f-4744-a1ec-2fc6176c9e61-trusted-ca\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689810 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6n6rw\" (UniqueName: \"kubernetes.io/projected/52ead505-bb1a-4256-8986-20eea1433dac-kube-api-access-6n6rw\") pod \"service-ca-9c57cc56f-cxw5w\" (UID: \"52ead505-bb1a-4256-8986-20eea1433dac\") " pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689836 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-trusted-ca-bundle\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689859 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrgq\" (UniqueName: \"kubernetes.io/projected/34704048-bc3f-4744-a1ec-2fc6176c9e61-kube-api-access-ngrgq\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689884 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5q25\" (UniqueName: 
\"kubernetes.io/projected/02cf73a4-150b-4a14-9e46-6e986b38304f-kube-api-access-c5q25\") pod \"openshift-config-operator-7777fb866f-zwx96\" (UID: \"02cf73a4-150b-4a14-9e46-6e986b38304f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689924 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p6dg7\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.689972 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4f720c5-d60c-449f-8c98-8295bd17c472-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-66nbw\" (UID: \"b4f720c5-d60c-449f-8c98-8295bd17c472\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.690003 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b95878a7-fa1d-4d27-a660-a80646f1c8db-webhook-cert\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.690027 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.690711 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.687104 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b021c9d-ee1d-4b78-bb6f-08303c69ebee-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-8qgdw\" (UID: \"9b021c9d-ee1d-4b78-bb6f-08303c69ebee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.693329 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/b95878a7-fa1d-4d27-a660-a80646f1c8db-tmpfs\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.698091 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a25fb107-dace-4acc-8cac-e73b2788db2f-serving-cert\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.698609 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/cc498551-0214-4646-bebe-1129a989142c-stats-auth\") pod \"router-default-5444994796-jghh5\" (UID: 
\"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.699477 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/cc498551-0214-4646-bebe-1129a989142c-default-certificate\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.703916 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9fdbc19c-851c-4465-ac8f-e86caca57814-serving-cert\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.705335 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.705430 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: E1203 17:02:26.706163 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" 
failed. No retries permitted until 2025-12-03 17:02:27.20614541 +0000 UTC m=+141.593666137 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.706344 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4261fd00-56f9-4b61-8f68-23092f89b47f-config\") pod \"openshift-apiserver-operator-796bbdcf4f-bk6pj\" (UID: \"4261fd00-56f9-4b61-8f68-23092f89b47f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.708207 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.708979 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-dir\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.709034 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfwsn\" (UniqueName: 
\"kubernetes.io/projected/cc498551-0214-4646-bebe-1129a989142c-kube-api-access-bfwsn\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.709837 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1f113f14-eee2-482f-9142-feac7fb442de-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-j79s8\" (UID: \"1f113f14-eee2-482f-9142-feac7fb442de\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.710208 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.711121 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.711856 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cc498551-0214-4646-bebe-1129a989142c-service-ca-bundle\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc 
kubenswrapper[4841]: I1203 17:02:26.712880 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4f720c5-d60c-449f-8c98-8295bd17c472-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-66nbw\" (UID: \"b4f720c5-d60c-449f-8c98-8295bd17c472\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.714214 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.714341 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a25fb107-dace-4acc-8cac-e73b2788db2f-config\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.715289 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.715612 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/9fdbc19c-851c-4465-ac8f-e86caca57814-etcd-client\") pod \"etcd-operator-b45778765-mvjc8\" (UID: 
\"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.716379 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-config\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.716977 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a95f13d-5349-4097-8482-88efaa760147-config\") pod \"kube-controller-manager-operator-78b949d7b-rx2p2\" (UID: \"1a95f13d-5349-4097-8482-88efaa760147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.717825 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfk4n\" (UniqueName: \"kubernetes.io/projected/ec55af94-b794-47c8-b6bd-ce686ad2f9a4-kube-api-access-nfk4n\") pod \"cluster-samples-operator-665b6dd947-2jhj8\" (UID: \"ec55af94-b794-47c8-b6bd-ce686ad2f9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.718358 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-oauth-serving-cert\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.718849 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/9fdbc19c-851c-4465-ac8f-e86caca57814-etcd-ca\") pod 
\"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.719556 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1a95f13d-5349-4097-8482-88efaa760147-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-rx2p2\" (UID: \"1a95f13d-5349-4097-8482-88efaa760147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.720006 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.720382 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-policies\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.720815 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1a95f13d-5349-4097-8482-88efaa760147-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-rx2p2\" (UID: \"1a95f13d-5349-4097-8482-88efaa760147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.720862 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b966f8b3-646c-4a50-a979-a65a815947e8-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qntpv\" (UID: \"b966f8b3-646c-4a50-a979-a65a815947e8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.721542 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-service-ca\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.721821 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-trusted-ca-bundle\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.722559 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9fdbc19c-851c-4465-ac8f-e86caca57814-config\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.722826 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.724743 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/02cf73a4-150b-4a14-9e46-6e986b38304f-available-featuregates\") pod \"openshift-config-operator-7777fb866f-zwx96\" (UID: \"02cf73a4-150b-4a14-9e46-6e986b38304f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.730293 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzprg\" (UniqueName: \"kubernetes.io/projected/9fdbc19c-851c-4465-ac8f-e86caca57814-kube-api-access-dzprg\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.730792 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ec55af94-b794-47c8-b6bd-ce686ad2f9a4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-2jhj8\" (UID: \"ec55af94-b794-47c8-b6bd-ce686ad2f9a4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.731782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b95878a7-fa1d-4d27-a660-a80646f1c8db-webhook-cert\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.732695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/92178376-5eeb-4fdd-948e-b6c7beaa25e1-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-xtkh4\" (UID: \"92178376-5eeb-4fdd-948e-b6c7beaa25e1\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.733994 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8qxn\" (UniqueName: \"kubernetes.io/projected/34e90356-ed2e-4e60-9e00-97a1b62d640b-kube-api-access-t8qxn\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.735019 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34704048-bc3f-4744-a1ec-2fc6176c9e61-bound-sa-token\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.735278 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlfh8\" (UniqueName: \"kubernetes.io/projected/4261fd00-56f9-4b61-8f68-23092f89b47f-kube-api-access-nlfh8\") pod \"openshift-apiserver-operator-796bbdcf4f-bk6pj\" (UID: \"4261fd00-56f9-4b61-8f68-23092f89b47f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.735674 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b95878a7-fa1d-4d27-a660-a80646f1c8db-apiservice-cert\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.735776 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/34704048-bc3f-4744-a1ec-2fc6176c9e61-trusted-ca\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.736259 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/9fdbc19c-851c-4465-ac8f-e86caca57814-etcd-service-ca\") pod \"etcd-operator-b45778765-mvjc8\" (UID: \"9fdbc19c-851c-4465-ac8f-e86caca57814\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.736802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnrdb\" (UniqueName: \"kubernetes.io/projected/ecf0dd5d-5164-4c5f-a4b5-e394182adc25-kube-api-access-mnrdb\") pod \"downloads-7954f5f757-k26kv\" (UID: \"ecf0dd5d-5164-4c5f-a4b5-e394182adc25\") " pod="openshift-console/downloads-7954f5f757-k26kv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.737175 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4f720c5-d60c-449f-8c98-8295bd17c472-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-66nbw\" (UID: \"b4f720c5-d60c-449f-8c98-8295bd17c472\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.737405 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4f720c5-d60c-449f-8c98-8295bd17c472-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-66nbw\" (UID: \"b4f720c5-d60c-449f-8c98-8295bd17c472\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.737455 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvfbc\" (UniqueName: \"kubernetes.io/projected/a25fb107-dace-4acc-8cac-e73b2788db2f-kube-api-access-cvfbc\") pod \"console-operator-58897d9998-kq6kf\" (UID: \"a25fb107-dace-4acc-8cac-e73b2788db2f\") " pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.737700 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.737751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1f113f14-eee2-482f-9142-feac7fb442de-proxy-tls\") pod \"machine-config-controller-84d6567774-j79s8\" (UID: \"1f113f14-eee2-482f-9142-feac7fb442de\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.737766 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b966f8b3-646c-4a50-a979-a65a815947e8-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qntpv\" (UID: \"b966f8b3-646c-4a50-a979-a65a815947e8\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.738166 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqx45\" (UniqueName: \"kubernetes.io/projected/9b021c9d-ee1d-4b78-bb6f-08303c69ebee-kube-api-access-cqx45\") pod \"multus-admission-controller-857f4d67dd-8qgdw\" (UID: 
\"9b021c9d-ee1d-4b78-bb6f-08303c69ebee\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.738175 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/02cf73a4-150b-4a14-9e46-6e986b38304f-serving-cert\") pod \"openshift-config-operator-7777fb866f-zwx96\" (UID: \"02cf73a4-150b-4a14-9e46-6e986b38304f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.738554 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt4wf\" (UniqueName: \"kubernetes.io/projected/59aa053f-a2a4-4af0-ae8f-bf2e65ee406c-kube-api-access-kt4wf\") pod \"catalog-operator-68c6474976-mt7kx\" (UID: \"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.738558 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/59aa053f-a2a4-4af0-ae8f-bf2e65ee406c-srv-cert\") pod \"catalog-operator-68c6474976-mt7kx\" (UID: \"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.738588 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-serving-cert\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.737900 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtz45\" (UniqueName: 
\"kubernetes.io/projected/92178376-5eeb-4fdd-948e-b6c7beaa25e1-kube-api-access-rtz45\") pod \"kube-storage-version-migrator-operator-b67b599dd-xtkh4\" (UID: \"92178376-5eeb-4fdd-948e-b6c7beaa25e1\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.738920 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cc498551-0214-4646-bebe-1129a989142c-metrics-certs\") pod \"router-default-5444994796-jghh5\" (UID: \"cc498551-0214-4646-bebe-1129a989142c\") " pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.739958 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.740206 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-oauth-config\") pod \"console-f9d7485db-ngr75\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.740618 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj9kj\" (UniqueName: \"kubernetes.io/projected/1f113f14-eee2-482f-9142-feac7fb442de-kube-api-access-xj9kj\") pod \"machine-config-controller-84d6567774-j79s8\" (UID: \"1f113f14-eee2-482f-9142-feac7fb442de\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.743733 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/34704048-bc3f-4744-a1ec-2fc6176c9e61-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.744211 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/59aa053f-a2a4-4af0-ae8f-bf2e65ee406c-profile-collector-cert\") pod \"catalog-operator-68c6474976-mt7kx\" (UID: \"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.744630 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4261fd00-56f9-4b61-8f68-23092f89b47f-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-bk6pj\" (UID: \"4261fd00-56f9-4b61-8f68-23092f89b47f\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.745165 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrgq\" (UniqueName: \"kubernetes.io/projected/34704048-bc3f-4744-a1ec-2fc6176c9e61-kube-api-access-ngrgq\") pod \"ingress-operator-5b745b69d9-7j2mh\" (UID: \"34704048-bc3f-4744-a1ec-2fc6176c9e61\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.745660 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cvw8\" (UniqueName: \"kubernetes.io/projected/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-kube-api-access-5cvw8\") pod \"oauth-openshift-558db77b4-v9zpn\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.746862 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6gnkf\" (UniqueName: \"kubernetes.io/projected/f762a14e-29c3-4f5d-b051-60431de11a82-kube-api-access-6gnkf\") pod \"migrator-59844c95c7-7dnfb\" (UID: \"f762a14e-29c3-4f5d-b051-60431de11a82\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.747244 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6s6s\" (UniqueName: \"kubernetes.io/projected/b95878a7-fa1d-4d27-a660-a80646f1c8db-kube-api-access-c6s6s\") pod \"packageserver-d55dfcdfc-k8hdg\" (UID: \"b95878a7-fa1d-4d27-a660-a80646f1c8db\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.748035 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc"] Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.748184 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.748214 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5q25\" (UniqueName: \"kubernetes.io/projected/02cf73a4-150b-4a14-9e46-6e986b38304f-kube-api-access-c5q25\") pod \"openshift-config-operator-7777fb866f-zwx96\" (UID: \"02cf73a4-150b-4a14-9e46-6e986b38304f\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.761023 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.777525 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-k26kv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.787540 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.790930 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:26 crc kubenswrapper[4841]: E1203 17:02:26.791123 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:27.291100755 +0000 UTC m=+141.678621482 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791228 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqbkw\" (UniqueName: \"kubernetes.io/projected/59a1b67a-3b53-4c3d-b352-9647a312efbb-kube-api-access-lqbkw\") pod \"ingress-canary-klz55\" (UID: \"59a1b67a-3b53-4c3d-b352-9647a312efbb\") " pod="openshift-ingress-canary/ingress-canary-klz55" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791311 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-csi-data-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791387 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9323acaa-2ebf-4776-824e-8e532a9f3b62-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791461 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad231084-6053-40b1-892c-284992b5df93-secret-volume\") pod \"collect-profiles-29413020-pdv9l\" (UID: 
\"ad231084-6053-40b1-892c-284992b5df93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791539 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-mountpoint-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk99j\" (UniqueName: \"kubernetes.io/projected/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-kube-api-access-bk99j\") pod \"marketplace-operator-79b997595-p6dg7\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791683 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtw9k\" (UniqueName: \"kubernetes.io/projected/443df56d-afb1-4463-a11f-4de38331f234-kube-api-access-vtw9k\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791571 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-csi-data-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791895 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9323acaa-2ebf-4776-824e-8e532a9f3b62-proxy-tls\") pod 
\"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.792016 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c07ec09-68a5-4c56-a97a-5eb0a73a020d-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng62w\" (UID: \"4c07ec09-68a5-4c56-a97a-5eb0a73a020d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.792090 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfpqg\" (UniqueName: \"kubernetes.io/projected/0a53af00-4d81-4217-a154-052396aac6dd-kube-api-access-zfpqg\") pod \"machine-config-server-vfkgp\" (UID: \"0a53af00-4d81-4217-a154-052396aac6dd\") " pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.792570 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-plugins-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.792712 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-registration-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.792849 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c177bc13-030a-442f-9efa-603b4620a2c8-metrics-tls\") pod \"dns-default-tmcv7\" (UID: \"c177bc13-030a-442f-9efa-603b4620a2c8\") " pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.792996 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6q5m\" (UniqueName: \"kubernetes.io/projected/ac141b85-7778-4866-8dee-ee96cbc5f6b7-kube-api-access-k6q5m\") pod \"service-ca-operator-777779d784-tmx4h\" (UID: \"ac141b85-7778-4866-8dee-ee96cbc5f6b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793096 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52ead505-bb1a-4256-8986-20eea1433dac-signing-key\") pod \"service-ca-9c57cc56f-cxw5w\" (UID: \"52ead505-bb1a-4256-8986-20eea1433dac\") " pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793212 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmz29\" (UniqueName: \"kubernetes.io/projected/4c07ec09-68a5-4c56-a97a-5eb0a73a020d-kube-api-access-wmz29\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng62w\" (UID: \"4c07ec09-68a5-4c56-a97a-5eb0a73a020d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793297 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-registration-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793307 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qx2k\" (UniqueName: \"kubernetes.io/projected/c177bc13-030a-442f-9efa-603b4620a2c8-kube-api-access-2qx2k\") pod \"dns-default-tmcv7\" (UID: \"c177bc13-030a-442f-9efa-603b4620a2c8\") " pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793396 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac141b85-7778-4866-8dee-ee96cbc5f6b7-config\") pod \"service-ca-operator-777779d784-tmx4h\" (UID: \"ac141b85-7778-4866-8dee-ee96cbc5f6b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793416 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c177bc13-030a-442f-9efa-603b4620a2c8-config-volume\") pod \"dns-default-tmcv7\" (UID: \"c177bc13-030a-442f-9efa-603b4620a2c8\") " pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793433 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9323acaa-2ebf-4776-824e-8e532a9f3b62-images\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793454 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94plm\" (UniqueName: \"kubernetes.io/projected/9323acaa-2ebf-4776-824e-8e532a9f3b62-kube-api-access-94plm\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc 
kubenswrapper[4841]: I1203 17:02:26.793471 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-socket-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793497 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793519 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0a53af00-4d81-4217-a154-052396aac6dd-node-bootstrap-token\") pod \"machine-config-server-vfkgp\" (UID: \"0a53af00-4d81-4217-a154-052396aac6dd\") " pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793535 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zj64h\" (UID: \"a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793562 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0a53af00-4d81-4217-a154-052396aac6dd-certs\") pod \"machine-config-server-vfkgp\" (UID: 
\"0a53af00-4d81-4217-a154-052396aac6dd\") " pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793584 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6n6rw\" (UniqueName: \"kubernetes.io/projected/52ead505-bb1a-4256-8986-20eea1433dac-kube-api-access-6n6rw\") pod \"service-ca-9c57cc56f-cxw5w\" (UID: \"52ead505-bb1a-4256-8986-20eea1433dac\") " pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793607 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p6dg7\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793626 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-socket-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793632 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a1b67a-3b53-4c3d-b352-9647a312efbb-cert\") pod \"ingress-canary-klz55\" (UID: \"59a1b67a-3b53-4c3d-b352-9647a312efbb\") " pod="openshift-ingress-canary/ingress-canary-klz55" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793676 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-p6dg7\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793695 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52ead505-bb1a-4256-8986-20eea1433dac-signing-cabundle\") pod \"service-ca-9c57cc56f-cxw5w\" (UID: \"52ead505-bb1a-4256-8986-20eea1433dac\") " pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793711 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z658g\" (UniqueName: \"kubernetes.io/projected/ad231084-6053-40b1-892c-284992b5df93-kube-api-access-z658g\") pod \"collect-profiles-29413020-pdv9l\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793730 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/292c8d77-02e4-4a9a-8f4c-3cd1acd07907-profile-collector-cert\") pod \"olm-operator-6b444d44fb-44m78\" (UID: \"292c8d77-02e4-4a9a-8f4c-3cd1acd07907\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793744 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5hrm\" (UniqueName: \"kubernetes.io/projected/292c8d77-02e4-4a9a-8f4c-3cd1acd07907-kube-api-access-r5hrm\") pod \"olm-operator-6b444d44fb-44m78\" (UID: \"292c8d77-02e4-4a9a-8f4c-3cd1acd07907\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793763 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac141b85-7778-4866-8dee-ee96cbc5f6b7-serving-cert\") pod \"service-ca-operator-777779d784-tmx4h\" (UID: \"ac141b85-7778-4866-8dee-ee96cbc5f6b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793782 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad231084-6053-40b1-892c-284992b5df93-config-volume\") pod \"collect-profiles-29413020-pdv9l\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793821 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/292c8d77-02e4-4a9a-8f4c-3cd1acd07907-srv-cert\") pod \"olm-operator-6b444d44fb-44m78\" (UID: \"292c8d77-02e4-4a9a-8f4c-3cd1acd07907\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.793839 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mp8nm\" (UniqueName: \"kubernetes.io/projected/a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4-kube-api-access-mp8nm\") pod \"package-server-manager-789f6589d5-zj64h\" (UID: \"a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" Dec 03 17:02:26 crc kubenswrapper[4841]: E1203 17:02:26.794297 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:27.294284256 +0000 UTC m=+141.681804983 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.791894 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-mountpoint-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.792455 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9323acaa-2ebf-4776-824e-8e532a9f3b62-auth-proxy-config\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.796153 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c177bc13-030a-442f-9efa-603b4620a2c8-config-volume\") pod \"dns-default-tmcv7\" (UID: \"c177bc13-030a-442f-9efa-603b4620a2c8\") " pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.796783 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9323acaa-2ebf-4776-824e-8e532a9f3b62-images\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.796798 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9323acaa-2ebf-4776-824e-8e532a9f3b62-proxy-tls\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.792795 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.798217 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/52ead505-bb1a-4256-8986-20eea1433dac-signing-cabundle\") pod \"service-ca-9c57cc56f-cxw5w\" (UID: \"52ead505-bb1a-4256-8986-20eea1433dac\") " pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.799147 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad231084-6053-40b1-892c-284992b5df93-config-volume\") pod \"collect-profiles-29413020-pdv9l\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.799510 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.801375 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac141b85-7778-4866-8dee-ee96cbc5f6b7-config\") pod \"service-ca-operator-777779d784-tmx4h\" (UID: \"ac141b85-7778-4866-8dee-ee96cbc5f6b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.801430 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/443df56d-afb1-4463-a11f-4de38331f234-plugins-dir\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.803458 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-p6dg7\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.807300 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/292c8d77-02e4-4a9a-8f4c-3cd1acd07907-srv-cert\") pod \"olm-operator-6b444d44fb-44m78\" (UID: \"292c8d77-02e4-4a9a-8f4c-3cd1acd07907\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.808363 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/4c07ec09-68a5-4c56-a97a-5eb0a73a020d-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-ng62w\" (UID: \"4c07ec09-68a5-4c56-a97a-5eb0a73a020d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.808733 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59a1b67a-3b53-4c3d-b352-9647a312efbb-cert\") pod \"ingress-canary-klz55\" (UID: \"59a1b67a-3b53-4c3d-b352-9647a312efbb\") " pod="openshift-ingress-canary/ingress-canary-klz55" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.809250 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.811761 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/52ead505-bb1a-4256-8986-20eea1433dac-signing-key\") pod \"service-ca-9c57cc56f-cxw5w\" (UID: \"52ead505-bb1a-4256-8986-20eea1433dac\") " pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.812544 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad231084-6053-40b1-892c-284992b5df93-secret-volume\") pod \"collect-profiles-29413020-pdv9l\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.814507 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/0a53af00-4d81-4217-a154-052396aac6dd-node-bootstrap-token\") pod \"machine-config-server-vfkgp\" (UID: \"0a53af00-4d81-4217-a154-052396aac6dd\") " pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 
17:02:26.815193 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/0a53af00-4d81-4217-a154-052396aac6dd-certs\") pod \"machine-config-server-vfkgp\" (UID: \"0a53af00-4d81-4217-a154-052396aac6dd\") " pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.815244 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zj64h\" (UID: \"a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.815339 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac141b85-7778-4866-8dee-ee96cbc5f6b7-serving-cert\") pod \"service-ca-operator-777779d784-tmx4h\" (UID: \"ac141b85-7778-4866-8dee-ee96cbc5f6b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.815740 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c177bc13-030a-442f-9efa-603b4620a2c8-metrics-tls\") pod \"dns-default-tmcv7\" (UID: \"c177bc13-030a-442f-9efa-603b4620a2c8\") " pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.816582 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.818435 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/292c8d77-02e4-4a9a-8f4c-3cd1acd07907-profile-collector-cert\") pod \"olm-operator-6b444d44fb-44m78\" (UID: \"292c8d77-02e4-4a9a-8f4c-3cd1acd07907\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.818663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqbkw\" (UniqueName: \"kubernetes.io/projected/59a1b67a-3b53-4c3d-b352-9647a312efbb-kube-api-access-lqbkw\") pod \"ingress-canary-klz55\" (UID: \"59a1b67a-3b53-4c3d-b352-9647a312efbb\") " pod="openshift-ingress-canary/ingress-canary-klz55" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.822253 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.823758 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmz29\" (UniqueName: \"kubernetes.io/projected/4c07ec09-68a5-4c56-a97a-5eb0a73a020d-kube-api-access-wmz29\") pod \"control-plane-machine-set-operator-78cbb6b69f-ng62w\" (UID: \"4c07ec09-68a5-4c56-a97a-5eb0a73a020d\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.824390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z658g\" (UniqueName: \"kubernetes.io/projected/ad231084-6053-40b1-892c-284992b5df93-kube-api-access-z658g\") pod \"collect-profiles-29413020-pdv9l\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.824742 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-p6dg7\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.825003 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6n6rw\" (UniqueName: \"kubernetes.io/projected/52ead505-bb1a-4256-8986-20eea1433dac-kube-api-access-6n6rw\") pod \"service-ca-9c57cc56f-cxw5w\" (UID: \"52ead505-bb1a-4256-8986-20eea1433dac\") " pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.825438 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mp8nm\" (UniqueName: 
\"kubernetes.io/projected/a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4-kube-api-access-mp8nm\") pod \"package-server-manager-789f6589d5-zj64h\" (UID: \"a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.825572 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94plm\" (UniqueName: \"kubernetes.io/projected/9323acaa-2ebf-4776-824e-8e532a9f3b62-kube-api-access-94plm\") pod \"machine-config-operator-74547568cd-2v7f7\" (UID: \"9323acaa-2ebf-4776-824e-8e532a9f3b62\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.825669 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfpqg\" (UniqueName: \"kubernetes.io/projected/0a53af00-4d81-4217-a154-052396aac6dd-kube-api-access-zfpqg\") pod \"machine-config-server-vfkgp\" (UID: \"0a53af00-4d81-4217-a154-052396aac6dd\") " pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.826499 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6q5m\" (UniqueName: \"kubernetes.io/projected/ac141b85-7778-4866-8dee-ee96cbc5f6b7-kube-api-access-k6q5m\") pod \"service-ca-operator-777779d784-tmx4h\" (UID: \"ac141b85-7778-4866-8dee-ee96cbc5f6b7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.826555 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtw9k\" (UniqueName: \"kubernetes.io/projected/443df56d-afb1-4463-a11f-4de38331f234-kube-api-access-vtw9k\") pod \"csi-hostpathplugin-25w5k\" (UID: \"443df56d-afb1-4463-a11f-4de38331f234\") " pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: 
I1203 17:02:26.826708 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qx2k\" (UniqueName: \"kubernetes.io/projected/c177bc13-030a-442f-9efa-603b4620a2c8-kube-api-access-2qx2k\") pod \"dns-default-tmcv7\" (UID: \"c177bc13-030a-442f-9efa-603b4620a2c8\") " pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.827567 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5hrm\" (UniqueName: \"kubernetes.io/projected/292c8d77-02e4-4a9a-8f4c-3cd1acd07907-kube-api-access-r5hrm\") pod \"olm-operator-6b444d44fb-44m78\" (UID: \"292c8d77-02e4-4a9a-8f4c-3cd1acd07907\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.828784 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk99j\" (UniqueName: \"kubernetes.io/projected/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-kube-api-access-bk99j\") pod \"marketplace-operator-79b997595-p6dg7\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.829032 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.842819 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" Dec 03 17:02:26 crc kubenswrapper[4841]: W1203 17:02:26.850381 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc498551_0214_4646_bebe_1129a989142c.slice/crio-c32afd4da32598c485a64ea12e144abf29cf0f29a932a6d4f2af9da0d6c04ae7 WatchSource:0}: Error finding container c32afd4da32598c485a64ea12e144abf29cf0f29a932a6d4f2af9da0d6c04ae7: Status 404 returned error can't find the container with id c32afd4da32598c485a64ea12e144abf29cf0f29a932a6d4f2af9da0d6c04ae7 Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.851307 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.857286 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.864348 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.871124 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.879073 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.887983 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.893968 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.898172 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:26 crc kubenswrapper[4841]: E1203 17:02:26.898357 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:27.3983315 +0000 UTC m=+141.785852227 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.898575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:26 crc kubenswrapper[4841]: E1203 17:02:26.898875 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:27.398866963 +0000 UTC m=+141.786387690 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.901584 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.907884 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.914605 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.921209 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.928660 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-klz55" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.954166 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-25w5k" Dec 03 17:02:26 crc kubenswrapper[4841]: I1203 17:02:26.960154 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vfkgp" Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:26.999838 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.000050 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 17:02:27.500020502 +0000 UTC m=+141.887541229 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.000263 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.000632 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:27.500622778 +0000 UTC m=+141.888143515 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.003709 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.014590 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.016262 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-8qgdw"] Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.018007 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.024946 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb" Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.035759 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.043955 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw"] Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.101437 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.105218 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 17:02:27.605187404 +0000 UTC m=+141.992708131 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.208957 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.209342 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:27.70930124 +0000 UTC m=+142.096821957 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.311939 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.315543 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:27.815505798 +0000 UTC m=+142.203026515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.325549 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.413385 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.413670 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:27.913659401 +0000 UTC m=+142.301180128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.416765 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.433560 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-zwx96"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.460944 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-k26kv"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.464618 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.469224 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.490936 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" event={"ID":"b4f720c5-d60c-449f-8c98-8295bd17c472","Type":"ContainerStarted","Data":"a433997eed0405d7257f39fa8584c72d700f7e61eafa4eefa31c3b77e4665465"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.516180 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.516819 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.01677883 +0000 UTC m=+142.404299557 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.543870 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" event={"ID":"671b18a0-95da-4c17-9ef5-4b0dc243ff4f","Type":"ContainerStarted","Data":"d7801b998df0dcb3b18ac9c852aa381fc3a11762c6386b54e642250ac1cb30b5"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.554740 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" event={"ID":"64a47326-6d95-4fea-bf1e-ef38e231e1f3","Type":"ContainerStarted","Data":"4d6315576da772ef3e0388e343208ae7120dc58b8682dd2da5300dfd1fb67756"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.567561 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" event={"ID":"876346bb-a538-4b29-a71f-6ca64d8b60f0","Type":"ContainerStarted","Data":"4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.585715 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ngr75"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.594255 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" event={"ID":"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac","Type":"ContainerStarted","Data":"3d390b1674dd63a8a1b5d25505fb7064ecc82e4dcb39800887151e55c9eb3642"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.600471 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" event={"ID":"9b021c9d-ee1d-4b78-bb6f-08303c69ebee","Type":"ContainerStarted","Data":"28928ebc934e6053875f18a3367a7437401223cace69ea9f6309ee49ebbfbd23"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.602013 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" event={"ID":"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7","Type":"ContainerStarted","Data":"fd5bc5c837151a31359ce3ced25a72ff967ac25522c1924808a4252903646a97"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.602167 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.608250 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" event={"ID":"666fd1d7-2a05-4ed2-814e-4fe4c30f5e52","Type":"ContainerStarted","Data":"935df823b93742d70fe91091a66f4de34cd611e92e02774507b83f7e621ebea3"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.612035 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" event={"ID":"0194ebac-1cfb-42eb-910e-7622443b8d15","Type":"ContainerStarted","Data":"956920c62ff85e680658d4ba819f46955689c3edd42d6d3d84bd241bc8b78961"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.613058 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" event={"ID":"c05ce6f5-b89a-458f-ac7d-c297a18822f7","Type":"ContainerStarted","Data":"ee6edb899fc1150b1d4feab353a14fd9b90e2e818498f25c8d72dd47c6425830"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.614783 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jghh5" event={"ID":"cc498551-0214-4646-bebe-1129a989142c","Type":"ContainerStarted","Data":"c32afd4da32598c485a64ea12e144abf29cf0f29a932a6d4f2af9da0d6c04ae7"}
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.615712 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" event={"ID":"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1","Type":"ContainerStarted","Data":"8775f0c92982105ff47f8c8356fec597bce6f9d1a85fed08e5ffa7d4d37e02be"}
Dec 03 17:02:27 crc kubenswrapper[4841]: W1203 17:02:27.617459 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a53af00_4d81_4217_a154_052396aac6dd.slice/crio-6360cd413d3f92cfeb068f3ecfc86f8400212015f38690ce34ae9ec9ea8bcb45 WatchSource:0}: Error finding container 6360cd413d3f92cfeb068f3ecfc86f8400212015f38690ce34ae9ec9ea8bcb45: Status 404 returned error can't find the container with id 6360cd413d3f92cfeb068f3ecfc86f8400212015f38690ce34ae9ec9ea8bcb45
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.618767 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.619122 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.119106699 +0000 UTC m=+142.506627426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.678051 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.683499 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mvjc8"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.720637 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.720862 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.220841204 +0000 UTC m=+142.608361931 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.720996 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.721323 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.221313876 +0000 UTC m=+142.608834603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:27 crc kubenswrapper[4841]: W1203 17:02:27.746892 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92178376_5eeb_4fdd_948e_b6c7beaa25e1.slice/crio-cb02bdca37c6993e2c33c194cf40f815c1d71ae68c7438f4189f98100935dcda WatchSource:0}: Error finding container cb02bdca37c6993e2c33c194cf40f815c1d71ae68c7438f4189f98100935dcda: Status 404 returned error can't find the container with id cb02bdca37c6993e2c33c194cf40f815c1d71ae68c7438f4189f98100935dcda
Dec 03 17:02:27 crc kubenswrapper[4841]: W1203 17:02:27.747181 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4261fd00_56f9_4b61_8f68_23092f89b47f.slice/crio-5430cb770326a8c54c6120b7cd50d4244f637db3e161d9510bd1bcd160c4684f WatchSource:0}: Error finding container 5430cb770326a8c54c6120b7cd50d4244f637db3e161d9510bd1bcd160c4684f: Status 404 returned error can't find the container with id 5430cb770326a8c54c6120b7cd50d4244f637db3e161d9510bd1bcd160c4684f
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.823166 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.823360 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.323330397 +0000 UTC m=+142.710851124 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.823748 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.824473 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.324450686 +0000 UTC m=+142.711971423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.866852 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7"]
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.924525 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.924961 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.424938878 +0000 UTC m=+142.812459605 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:27 crc kubenswrapper[4841]: I1203 17:02:27.925048 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:27 crc kubenswrapper[4841]: E1203 17:02:27.925431 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.42542166 +0000 UTC m=+142.812942397 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.013207 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.029787 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.030287 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.530258224 +0000 UTC m=+142.917778961 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.057965 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p6dg7"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.117600 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tmcv7"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.122285 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.132249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.132598 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.632581543 +0000 UTC m=+143.020102260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.233238 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.233380 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.733363733 +0000 UTC m=+143.120884460 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.233459 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.233742 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.733735052 +0000 UTC m=+143.121255779 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.335360 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.336020 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.835987909 +0000 UTC m=+143.223508676 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.437201 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.437855 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:28.937822636 +0000 UTC m=+143.325343403 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.519434 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.519501 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.527075 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.532356 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-klz55"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.536487 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.537954 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.538585 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.038543424 +0000 UTC m=+143.426064201 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.539038 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9zpn"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.541195 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.546796 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.548781 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.553178 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-kq6kf"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.556012 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cxw5w"]
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.557944 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-25w5k"]
Dec 03 17:02:28 crc kubenswrapper[4841]: W1203 17:02:28.593220 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52ead505_bb1a_4256_8986_20eea1433dac.slice/crio-2c8a707c429e4ec5b135f690ae2404460883ea38651455fb3398c815d069e09c WatchSource:0}: Error finding container 2c8a707c429e4ec5b135f690ae2404460883ea38651455fb3398c815d069e09c: Status 404 returned error can't find the container with id 2c8a707c429e4ec5b135f690ae2404460883ea38651455fb3398c815d069e09c
Dec 03 17:02:28 crc kubenswrapper[4841]: W1203 17:02:28.598180 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f113f14_eee2_482f_9142_feac7fb442de.slice/crio-74ef19ea212cefbafbfb37a0707b8f0d9d834725d9f1a3299f7f0d6a6f4c2aa0 WatchSource:0}: Error finding container 74ef19ea212cefbafbfb37a0707b8f0d9d834725d9f1a3299f7f0d6a6f4c2aa0: Status 404 returned error can't find the container with id 74ef19ea212cefbafbfb37a0707b8f0d9d834725d9f1a3299f7f0d6a6f4c2aa0
Dec 03 17:02:28 crc kubenswrapper[4841]: W1203 17:02:28.604739 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb966f8b3_646c_4a50_a979_a65a815947e8.slice/crio-64652b6b90996054b175404698bc411a9ed166cc1983395c53a0b502e5e56819 WatchSource:0}: Error finding container 64652b6b90996054b175404698bc411a9ed166cc1983395c53a0b502e5e56819: Status 404 returned error can't find the container with id 64652b6b90996054b175404698bc411a9ed166cc1983395c53a0b502e5e56819
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.620914 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" event={"ID":"ad231084-6053-40b1-892c-284992b5df93","Type":"ContainerStarted","Data":"98b5693d485a03c868f478fd59d7b96cf5d3cabdd3dce64d021cadda92a60390"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.622848 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" event={"ID":"92178376-5eeb-4fdd-948e-b6c7beaa25e1","Type":"ContainerStarted","Data":"cb02bdca37c6993e2c33c194cf40f815c1d71ae68c7438f4189f98100935dcda"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.624062 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" event={"ID":"34704048-bc3f-4744-a1ec-2fc6176c9e61","Type":"ContainerStarted","Data":"9d9cab752a4d67ce573a4e079cc5dd60e96cbea67475294bfd2fda4ff3a33429"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.626356 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" event={"ID":"02cf73a4-150b-4a14-9e46-6e986b38304f","Type":"ContainerStarted","Data":"3377abf1c1d160d10c08f237df955f6a1e8436587539c5363a292a3a003c8097"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.630636 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" event={"ID":"9fdbc19c-851c-4465-ac8f-e86caca57814","Type":"ContainerStarted","Data":"674e5147cad5678449a840954fb1361974f86d0eb638766ab6d07f9553467731"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.636299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" event={"ID":"292c8d77-02e4-4a9a-8f4c-3cd1acd07907","Type":"ContainerStarted","Data":"c15d5b766af0955368fc7bdb94dc5b7123adb85911c2a08e6cdfe955bdf41447"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.639064 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.639424 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.139346085 +0000 UTC m=+143.526866812 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.641329 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tmcv7" event={"ID":"c177bc13-030a-442f-9efa-603b4620a2c8","Type":"ContainerStarted","Data":"099833b406d26d6990051f120eab92ab712612f93956d728ac6c386e44baf2aa"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.644040 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" event={"ID":"ac141b85-7778-4866-8dee-ee96cbc5f6b7","Type":"ContainerStarted","Data":"5b32ad7801bb47af8440164e4a8615946d83fc7d9b34cacfd36e87dc66a58408"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.650671 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" event={"ID":"0648214d-f0e4-4a8d-8f0d-2c3751c8e369","Type":"ContainerStarted","Data":"c294bb47c11aef06babbc3dfa2cb769e394d2f7c5abed454dc5e435421a7dc3b"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.657863 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" event={"ID":"9323acaa-2ebf-4776-824e-8e532a9f3b62","Type":"ContainerStarted","Data":"b1ff17194eb3658d3937f199846224c21abc23612185832e8eef9f51e962c6cd"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.660192 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" event={"ID":"b966f8b3-646c-4a50-a979-a65a815947e8","Type":"ContainerStarted","Data":"64652b6b90996054b175404698bc411a9ed166cc1983395c53a0b502e5e56819"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.662787 4841 generic.go:334] "Generic (PLEG): container finished" podID="671b18a0-95da-4c17-9ef5-4b0dc243ff4f" containerID="d7801b998df0dcb3b18ac9c852aa381fc3a11762c6386b54e642250ac1cb30b5" exitCode=0
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.663024 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" event={"ID":"671b18a0-95da-4c17-9ef5-4b0dc243ff4f","Type":"ContainerDied","Data":"d7801b998df0dcb3b18ac9c852aa381fc3a11762c6386b54e642250ac1cb30b5"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.666488 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" event={"ID":"4c07ec09-68a5-4c56-a97a-5eb0a73a020d","Type":"ContainerStarted","Data":"0befe2b47b64006746ffae050eb0c814c9f47ea5fe48dfe5aab9502bd804d860"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.668466 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" event={"ID":"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c","Type":"ContainerStarted","Data":"3a668e2807e1e76df57f9d608555ee9f00141295d88b51cdcbc78c35b2ffbd2f"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.669487 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" event={"ID":"4261fd00-56f9-4b61-8f68-23092f89b47f","Type":"ContainerStarted","Data":"5430cb770326a8c54c6120b7cd50d4244f637db3e161d9510bd1bcd160c4684f"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.670606 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" event={"ID":"64a47326-6d95-4fea-bf1e-ef38e231e1f3","Type":"ContainerStarted","Data":"5b100891ce4bde5137efa3e9a1fb0361f8eeff3babf1e4b707ec79bf0e7991d7"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.671742 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-k26kv" event={"ID":"ecf0dd5d-5164-4c5f-a4b5-e394182adc25","Type":"ContainerStarted","Data":"06b61277208f8e290ce8c67d77ff84b07481a86c21a68dd5e224aa2fc71f51f5"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.672432 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25w5k" event={"ID":"443df56d-afb1-4463-a11f-4de38331f234","Type":"ContainerStarted","Data":"6bbd0805e3f988079952dc36a8d2eed3a1aaa1b10fea5f525584c45b0182aa06"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.673151 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" event={"ID":"1f113f14-eee2-482f-9142-feac7fb442de","Type":"ContainerStarted","Data":"74ef19ea212cefbafbfb37a0707b8f0d9d834725d9f1a3299f7f0d6a6f4c2aa0"}
Dec 03 17:02:28 crc kubenswrapper[4841]: I1203
17:02:28.674426 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ngr75" event={"ID":"34e90356-ed2e-4e60-9e00-97a1b62d640b","Type":"ContainerStarted","Data":"4bc73e1809e9fc2f467c7281ed0da4dae2020f1007687d06f45dadb6d25fd386"} Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.675522 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vfkgp" event={"ID":"0a53af00-4d81-4217-a154-052396aac6dd","Type":"ContainerStarted","Data":"6360cd413d3f92cfeb068f3ecfc86f8400212015f38690ce34ae9ec9ea8bcb45"} Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.677272 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" event={"ID":"bd9368a6-dc6b-42fd-9062-a01612ceb28c","Type":"ContainerStarted","Data":"1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935"} Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.689897 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" event={"ID":"b95878a7-fa1d-4d27-a660-a80646f1c8db","Type":"ContainerStarted","Data":"88d09163b0fe790edbc441396bcd2ccc7176a48defd7a55407a8b5ab25b3f3e9"} Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.706171 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" event={"ID":"52ead505-bb1a-4256-8986-20eea1433dac","Type":"ContainerStarted","Data":"2c8a707c429e4ec5b135f690ae2404460883ea38651455fb3398c815d069e09c"} Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.715370 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-klz55" event={"ID":"59a1b67a-3b53-4c3d-b352-9647a312efbb","Type":"ContainerStarted","Data":"4ba124a87521d44b643467746a42c3d425c8135df6fe3add4874fa522e4037df"} Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 
17:02:28.715785 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.717877 4841 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-59tlz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.717962 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" podUID="876346bb-a538-4b29-a71f-6ca64d8b60f0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.734762 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" podStartSLOduration=119.734746127 podStartE2EDuration="1m59.734746127s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:28.732459988 +0000 UTC m=+143.119980715" watchObservedRunningTime="2025-12-03 17:02:28.734746127 +0000 UTC m=+143.122266854" Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.740042 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:28 crc 
kubenswrapper[4841]: E1203 17:02:28.740499 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.240473373 +0000 UTC m=+143.627994170 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.740629 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.741638 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.241620583 +0000 UTC m=+143.629141310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.852978 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.853107 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.353079996 +0000 UTC m=+143.740600733 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.853731 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.854145 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.354129942 +0000 UTC m=+143.741650669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.954377 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.954809 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.454788499 +0000 UTC m=+143.842309226 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:28 crc kubenswrapper[4841]: I1203 17:02:28.955064 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:28 crc kubenswrapper[4841]: E1203 17:02:28.955402 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.455386974 +0000 UTC m=+143.842907701 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.056621 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.056787 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.556761379 +0000 UTC m=+143.944282106 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.057193 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.057506 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.557494108 +0000 UTC m=+143.945014835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.157894 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.158030 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.658007481 +0000 UTC m=+144.045528208 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.158242 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.159014 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.659000416 +0000 UTC m=+144.046521143 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.259488 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.259660 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.759638622 +0000 UTC m=+144.147159349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.260197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.260532 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.760515865 +0000 UTC m=+144.148036592 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.361382 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.361563 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.861537311 +0000 UTC m=+144.249058038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.361611 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.368285 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.868262563 +0000 UTC m=+144.255783290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.462489 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.462944 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:29.962927096 +0000 UTC m=+144.350447823 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.564775 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.565548 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.065530472 +0000 UTC m=+144.453051199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.666202 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.666425 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.166395524 +0000 UTC m=+144.553916251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.735299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jghh5" event={"ID":"cc498551-0214-4646-bebe-1129a989142c","Type":"ContainerStarted","Data":"eebc52ec59aff15df4f219350e466daf372266cfa05424445bc9c78734a0a17b"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.738577 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" event={"ID":"b95878a7-fa1d-4d27-a660-a80646f1c8db","Type":"ContainerStarted","Data":"756354f2487f45f54ae0451595f5c26485da453a7c33256f1af0d5c315697a35"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.738846 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.741287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" event={"ID":"1a95f13d-5349-4097-8482-88efaa760147","Type":"ContainerStarted","Data":"9b00de59f825c822e63f9ca01c3a8430b3aa9ccd4f53d2e03eaaf781617c3fa7"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.746432 4841 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-k8hdg container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 
10.217.0.39:5443: connect: connection refused" start-of-body= Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.746481 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" podUID="b95878a7-fa1d-4d27-a660-a80646f1c8db" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.39:5443/healthz\": dial tcp 10.217.0.39:5443: connect: connection refused" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.747118 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-k26kv" event={"ID":"ecf0dd5d-5164-4c5f-a4b5-e394182adc25","Type":"ContainerStarted","Data":"b62a19e799c7a91fb89c2085dcce69ef0b982fa5d002174984eca5db1981dc87"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.747469 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-k26kv" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.762631 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.762708 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.764757 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.765098 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 
container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.765151 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.769836 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.770953 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.27093797 +0000 UTC m=+144.658458697 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.785418 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" event={"ID":"292c8d77-02e4-4a9a-8f4c-3cd1acd07907","Type":"ContainerStarted","Data":"abc56f6bd158130ee02838a4ceaa99ce07584e78d4d547dff2d488e82a2daf5b"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.786532 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.797483 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jghh5" podStartSLOduration=120.797457979 podStartE2EDuration="2m0.797457979s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:29.763277654 +0000 UTC m=+144.150798391" watchObservedRunningTime="2025-12-03 17:02:29.797457979 +0000 UTC m=+144.184978706" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.817143 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" event={"ID":"a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4","Type":"ContainerStarted","Data":"c75336ecd5cd106aa5510c1b164076564399b07d98006438d884570fa75aab9a"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 
17:02:29.817201 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" event={"ID":"a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4","Type":"ContainerStarted","Data":"968d10fbbccae89387fe98b644159e319ef33ef87ba832a5200bbd366d581507"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.817315 4841 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-44m78 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.817353 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" podUID="292c8d77-02e4-4a9a-8f4c-3cd1acd07907" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.829560 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" podStartSLOduration=120.82953738 podStartE2EDuration="2m0.82953738s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:29.828155305 +0000 UTC m=+144.215676032" watchObservedRunningTime="2025-12-03 17:02:29.82953738 +0000 UTC m=+144.217058107" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.829880 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-k26kv" podStartSLOduration=120.829872399 podStartE2EDuration="2m0.829872399s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:29.799292956 +0000 UTC m=+144.186813683" watchObservedRunningTime="2025-12-03 17:02:29.829872399 +0000 UTC m=+144.217393136" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.835340 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" event={"ID":"4c07ec09-68a5-4c56-a97a-5eb0a73a020d","Type":"ContainerStarted","Data":"ac471b05df0261d1ad379a9979437655dd12a90cf9df2b9bea1ea37bff0057ff"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.845603 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" event={"ID":"9b021c9d-ee1d-4b78-bb6f-08303c69ebee","Type":"ContainerStarted","Data":"78bdf293879e0504ab34034adbca4fa22950e665164e7a085c6731d6ef11397e"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.865597 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" event={"ID":"b4f720c5-d60c-449f-8c98-8295bd17c472","Type":"ContainerStarted","Data":"4ca7c056400d0d5b236df376c5757d5f0e852fb7ac45eb99ee5c6848f3f308b1"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.869109 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" event={"ID":"0194ebac-1cfb-42eb-910e-7622443b8d15","Type":"ContainerStarted","Data":"61abd2c458f647a21ce21419fdf3ada345b44ce14e75b5d4f2fef5f02424f7f1"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.870544 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.872167 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.37213558 +0000 UTC m=+144.759656307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.875169 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" podStartSLOduration=120.875151277 podStartE2EDuration="2m0.875151277s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:29.850965749 +0000 UTC m=+144.238486486" watchObservedRunningTime="2025-12-03 17:02:29.875151277 +0000 UTC m=+144.262672014" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.884318 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ngr75" event={"ID":"34e90356-ed2e-4e60-9e00-97a1b62d640b","Type":"ContainerStarted","Data":"64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.900755 4841 generic.go:334] "Generic (PLEG): container finished" podID="02cf73a4-150b-4a14-9e46-6e986b38304f" 
containerID="6957a527b33f8fc3d4270d8bca989dd1bc444e4ee3828914482f186c2d6207ca" exitCode=0 Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.901086 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" event={"ID":"02cf73a4-150b-4a14-9e46-6e986b38304f","Type":"ContainerDied","Data":"6957a527b33f8fc3d4270d8bca989dd1bc444e4ee3828914482f186c2d6207ca"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.906970 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vfkgp" event={"ID":"0a53af00-4d81-4217-a154-052396aac6dd","Type":"ContainerStarted","Data":"ce29b9b18cc248cbfa1477be32ad46727df0801bdbc96c221b2cf655658e7b54"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.920033 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-66nbw" podStartSLOduration=120.920013345 podStartE2EDuration="2m0.920013345s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:29.913368295 +0000 UTC m=+144.300889022" watchObservedRunningTime="2025-12-03 17:02:29.920013345 +0000 UTC m=+144.307534072" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.920467 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-ng62w" podStartSLOduration=120.920460017 podStartE2EDuration="2m0.920460017s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:29.877162378 +0000 UTC m=+144.264683105" watchObservedRunningTime="2025-12-03 17:02:29.920460017 +0000 UTC m=+144.307980744" Dec 
03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.923131 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" event={"ID":"1f113f14-eee2-482f-9142-feac7fb442de","Type":"ContainerStarted","Data":"6fe8ca1d19acfb4a316d27a0e71f4fd55b050312621621a1384efd3d869e41c3"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.931778 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" event={"ID":"b966f8b3-646c-4a50-a979-a65a815947e8","Type":"ContainerStarted","Data":"181077b9bd1ebfc9a99081036d8646875d779f0a0ec26f4eae18acd356f0039b"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.933055 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-dkcrc" podStartSLOduration=120.933033698 podStartE2EDuration="2m0.933033698s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:29.931614262 +0000 UTC m=+144.319134989" watchObservedRunningTime="2025-12-03 17:02:29.933033698 +0000 UTC m=+144.320554425" Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.942350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" event={"ID":"34704048-bc3f-4744-a1ec-2fc6176c9e61","Type":"ContainerStarted","Data":"401fcc470f552b179fea5967c2058850005d48c6e790f1c669ee786abe113715"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.942407 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" event={"ID":"34704048-bc3f-4744-a1ec-2fc6176c9e61","Type":"ContainerStarted","Data":"0e78e73f75a465dffcb25e782fba33e861a9130cc79383bdfa56589ddeeb8212"} Dec 03 17:02:29 crc 
kubenswrapper[4841]: I1203 17:02:29.952381 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" event={"ID":"ec55af94-b794-47c8-b6bd-ce686ad2f9a4","Type":"ContainerStarted","Data":"2435034571ca1aed0a5ea06f52f6a5d1e5cc1a0362a3f2d9ca7a64e93d83c397"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.952455 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" event={"ID":"ec55af94-b794-47c8-b6bd-ce686ad2f9a4","Type":"ContainerStarted","Data":"7f3472d56f6e2a54ed15cd62fff7b5dbefd9da458064fc81c5445c63e8165b57"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.967843 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb" event={"ID":"f762a14e-29c3-4f5d-b051-60431de11a82","Type":"ContainerStarted","Data":"cf74a4c3f38c673b4bd57bafaa08c0e3f06d36233537cbacecaf035d2d7be7de"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.967896 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb" event={"ID":"f762a14e-29c3-4f5d-b051-60431de11a82","Type":"ContainerStarted","Data":"d70a2756ddb9a87f0102bf2ccf5156742cbd265aee38ef0e652da4a05e7bbece"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.974169 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:29 crc kubenswrapper[4841]: E1203 17:02:29.977207 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.477192279 +0000 UTC m=+144.864713006 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.985413 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" event={"ID":"4261fd00-56f9-4b61-8f68-23092f89b47f","Type":"ContainerStarted","Data":"586a412085836677b5b05b9f9e2dcc19ebf2c28eb2521e7cb4c677dd66ad93e1"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.987653 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-klz55" event={"ID":"59a1b67a-3b53-4c3d-b352-9647a312efbb","Type":"ContainerStarted","Data":"bcf01f717aa6d830dc960f6030bcb85a0881da9480ae9130c2078604d791bf53"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.990185 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" event={"ID":"9fdbc19c-851c-4465-ac8f-e86caca57814","Type":"ContainerStarted","Data":"239119774f98788c05ad42cfdd0c25af83445282ffb3644c9bb7962ddb7e9627"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.993241 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" 
event={"ID":"0648214d-f0e4-4a8d-8f0d-2c3751c8e369","Type":"ContainerStarted","Data":"3f8e2c9167ae16077eab874a8a0c00dadcb95f3fc02d4427ebabdd11d585f04f"} Dec 03 17:02:29 crc kubenswrapper[4841]: I1203 17:02:29.994162 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.000184 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" event={"ID":"92178376-5eeb-4fdd-948e-b6c7beaa25e1","Type":"ContainerStarted","Data":"f98e09a300bcb47af13335b192fef48cf111aa9ed758b8d068c7bd40b707555c"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.000861 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ngr75" podStartSLOduration=121.000843034 podStartE2EDuration="2m1.000843034s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:29.967303826 +0000 UTC m=+144.354824563" watchObservedRunningTime="2025-12-03 17:02:30.000843034 +0000 UTC m=+144.388363761" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.002220 4841 generic.go:334] "Generic (PLEG): container finished" podID="b38ec4ff-67eb-467c-97d3-efbc96c8b4d7" containerID="7da08bc6a9ad72b40cf47e9830af118fafac69ab1f01e88a4de0e2cec36f31a4" exitCode=0 Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.002271 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" event={"ID":"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7","Type":"ContainerDied","Data":"7da08bc6a9ad72b40cf47e9830af118fafac69ab1f01e88a4de0e2cec36f31a4"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.005528 4841 patch_prober.go:28] interesting 
pod/marketplace-operator-79b997595-p6dg7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.005701 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" podUID="0648214d-f0e4-4a8d-8f0d-2c3751c8e369" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.017475 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" event={"ID":"59aa053f-a2a4-4af0-ae8f-bf2e65ee406c","Type":"ContainerStarted","Data":"5c1d9fe83c2e47adbaa986a913fe9ee392426a8047337ad07302a853e9f4d326"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.024702 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.026493 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" event={"ID":"ac141b85-7778-4866-8dee-ee96cbc5f6b7","Type":"ContainerStarted","Data":"e0ece7317bcc0cc70da0df5d40c24e766adf56c1771afda5488da874590bb722"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.043255 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vfkgp" podStartSLOduration=8.043240759 podStartE2EDuration="8.043240759s" podCreationTimestamp="2025-12-03 17:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 17:02:30.041348141 +0000 UTC m=+144.428868868" watchObservedRunningTime="2025-12-03 17:02:30.043240759 +0000 UTC m=+144.430761486" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.046356 4841 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mt7kx container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.046535 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" podUID="59aa053f-a2a4-4af0-ae8f-bf2e65ee406c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.046852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kq6kf" event={"ID":"a25fb107-dace-4acc-8cac-e73b2788db2f","Type":"ContainerStarted","Data":"419d858fd4c0f7ca6809fb6e95eb3953bb6781a9533db2bfa7820e45fe03b556"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.047071 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-kq6kf" event={"ID":"a25fb107-dace-4acc-8cac-e73b2788db2f","Type":"ContainerStarted","Data":"d3b2f8f6d81f8bd44dbfe953b2bea355abd51636487e2d5030f4be53f7abb282"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.047967 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.057834 4841 patch_prober.go:28] interesting pod/console-operator-58897d9998-kq6kf container/console-operator 
namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.058145 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kq6kf" podUID="a25fb107-dace-4acc-8cac-e73b2788db2f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.059224 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" event={"ID":"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92","Type":"ContainerStarted","Data":"c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.059372 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" event={"ID":"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92","Type":"ContainerStarted","Data":"c02000ebef136067b8fb72fcb8f57a4eee8f33528d7fd1b792d6a80dd184038a"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.060519 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.065420 4841 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-v9zpn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.065719 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" 
podUID="734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.076714 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.079035 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.579015845 +0000 UTC m=+144.966536572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.103494 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" event={"ID":"ad231084-6053-40b1-892c-284992b5df93","Type":"ContainerStarted","Data":"61f081450db7f42930a6fbe061d97e9fc539af5dbd2e07cfd7b35b62fb9c257d"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.113873 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-xtkh4" podStartSLOduration=121.113859417 podStartE2EDuration="2m1.113859417s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.113504228 +0000 UTC m=+144.501024965" watchObservedRunningTime="2025-12-03 17:02:30.113859417 +0000 UTC m=+144.501380134" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.115293 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-7j2mh" podStartSLOduration=121.115286574 podStartE2EDuration="2m1.115286574s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.078629935 +0000 UTC m=+144.466150672" watchObservedRunningTime="2025-12-03 17:02:30.115286574 +0000 UTC m=+144.502807301" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.145727 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" event={"ID":"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1","Type":"ContainerStarted","Data":"b78443236fd8b53886943860190a8b419bcdd199d8f770bbe9527ff43b9dd5f2"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.166745 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" event={"ID":"c05ce6f5-b89a-458f-ac7d-c297a18822f7","Type":"ContainerStarted","Data":"b8234e2ef6718befd277b248a94776de16113d0df6797b7683b537f0c9242c9d"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.171305 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" 
event={"ID":"9323acaa-2ebf-4776-824e-8e532a9f3b62","Type":"ContainerStarted","Data":"6cdd3853a655599b818089fa9fafdb641e5bbbdb70a492e4bb8b9706ce422256"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.172025 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" event={"ID":"9323acaa-2ebf-4776-824e-8e532a9f3b62","Type":"ContainerStarted","Data":"f55ba7498925b698524f7c20eadffed8220e8a339949fcd6bfb5b264d181a9d4"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.173882 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tmcv7" event={"ID":"c177bc13-030a-442f-9efa-603b4620a2c8","Type":"ContainerStarted","Data":"e3acf3eba9098ed967a97982ca3821526e2cebd917c8571558f1f2c2cfa61926"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.175966 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" event={"ID":"52ead505-bb1a-4256-8986-20eea1433dac","Type":"ContainerStarted","Data":"ad6349bbab27202dfc0e482344a7182489c817a6f27c22dec74dcb06200d15c7"} Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.175991 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.177645 4841 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mj44m container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.177675 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" podUID="bd9368a6-dc6b-42fd-9062-a01612ceb28c" containerName="controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.178500 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.184253 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-mvjc8" podStartSLOduration=121.184234078 podStartE2EDuration="2m1.184234078s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.156840687 +0000 UTC m=+144.544361414" watchObservedRunningTime="2025-12-03 17:02:30.184234078 +0000 UTC m=+144.571754805" Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.187564 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.687547183 +0000 UTC m=+145.075067990 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.190643 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.223342 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-bk6pj" podStartSLOduration=122.223320239 podStartE2EDuration="2m2.223320239s" podCreationTimestamp="2025-12-03 17:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.186523927 +0000 UTC m=+144.574044654" watchObservedRunningTime="2025-12-03 17:02:30.223320239 +0000 UTC m=+144.610840966" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.279571 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" podStartSLOduration=121.279546128 podStartE2EDuration="2m1.279546128s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.226834739 +0000 UTC m=+144.614355466" watchObservedRunningTime="2025-12-03 17:02:30.279546128 +0000 UTC m=+144.667066865" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.281769 4841 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-klz55" podStartSLOduration=8.281755255 podStartE2EDuration="8.281755255s" podCreationTimestamp="2025-12-03 17:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.273665478 +0000 UTC m=+144.661186205" watchObservedRunningTime="2025-12-03 17:02:30.281755255 +0000 UTC m=+144.669275982" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.288488 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.288738 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.788712623 +0000 UTC m=+145.176233380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.288943 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.290724 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.790711924 +0000 UTC m=+145.178232651 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.329804 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qntpv" podStartSLOduration=121.329785574 podStartE2EDuration="2m1.329785574s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.328877331 +0000 UTC m=+144.716398058" watchObservedRunningTime="2025-12-03 17:02:30.329785574 +0000 UTC m=+144.717306301" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.355645 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-tmx4h" podStartSLOduration=121.355614055 podStartE2EDuration="2m1.355614055s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.353759548 +0000 UTC m=+144.741280275" watchObservedRunningTime="2025-12-03 17:02:30.355614055 +0000 UTC m=+144.743134782" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.379892 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" podStartSLOduration=121.379870546 podStartE2EDuration="2m1.379870546s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.379834885 +0000 UTC m=+144.767355612" watchObservedRunningTime="2025-12-03 17:02:30.379870546 +0000 UTC m=+144.767391273" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.390603 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.390986 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.8909679 +0000 UTC m=+145.278488627 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.405684 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-888hd" podStartSLOduration=121.405667757 podStartE2EDuration="2m1.405667757s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.404325802 +0000 UTC m=+144.791846529" watchObservedRunningTime="2025-12-03 17:02:30.405667757 +0000 UTC m=+144.793188484" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.429071 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" podStartSLOduration=122.429049415 podStartE2EDuration="2m2.429049415s" podCreationTimestamp="2025-12-03 17:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.423226556 +0000 UTC m=+144.810747283" watchObservedRunningTime="2025-12-03 17:02:30.429049415 +0000 UTC m=+144.816570142" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.493363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.493786 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:30.993771542 +0000 UTC m=+145.381292269 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.504558 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" podStartSLOduration=121.504524847 podStartE2EDuration="2m1.504524847s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.504412424 +0000 UTC m=+144.891933151" watchObservedRunningTime="2025-12-03 17:02:30.504524847 +0000 UTC m=+144.892045574" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.506542 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-cxw5w" podStartSLOduration=121.506532388 podStartE2EDuration="2m1.506532388s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 17:02:30.473967105 +0000 UTC m=+144.861487842" watchObservedRunningTime="2025-12-03 17:02:30.506532388 +0000 UTC m=+144.894053115" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.559971 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-2497v" podStartSLOduration=122.559948886 podStartE2EDuration="2m2.559948886s" podCreationTimestamp="2025-12-03 17:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.534773631 +0000 UTC m=+144.922294358" watchObservedRunningTime="2025-12-03 17:02:30.559948886 +0000 UTC m=+144.947469613" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.594572 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.594971 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.094952672 +0000 UTC m=+145.482473409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.621302 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" podStartSLOduration=121.621267285 podStartE2EDuration="2m1.621267285s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.593255028 +0000 UTC m=+144.980775765" watchObservedRunningTime="2025-12-03 17:02:30.621267285 +0000 UTC m=+145.008788012" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.621844 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-2v7f7" podStartSLOduration=121.62183848 podStartE2EDuration="2m1.62183848s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.621667096 +0000 UTC m=+145.009187823" watchObservedRunningTime="2025-12-03 17:02:30.62183848 +0000 UTC m=+145.009359207" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.657648 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-kq6kf" podStartSLOduration=121.657629446 podStartE2EDuration="2m1.657629446s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.657553584 +0000 UTC m=+145.045074311" watchObservedRunningTime="2025-12-03 17:02:30.657629446 +0000 UTC m=+145.045150173" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.696111 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.696538 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.196523652 +0000 UTC m=+145.584044389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.700644 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" podStartSLOduration=122.700625597 podStartE2EDuration="2m2.700625597s" podCreationTimestamp="2025-12-03 17:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:30.700140854 +0000 UTC m=+145.087661581" watchObservedRunningTime="2025-12-03 17:02:30.700625597 +0000 UTC m=+145.088146324" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.759698 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:30 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:30 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:30 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.759756 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.797331 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.797788 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.297772413 +0000 UTC m=+145.685293140 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:30 crc kubenswrapper[4841]: I1203 17:02:30.899068 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:30 crc kubenswrapper[4841]: E1203 17:02:30.899443 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.399427955 +0000 UTC m=+145.786948682 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.000438 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.000655 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.500624676 +0000 UTC m=+145.888145423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.000771 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.001180 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.50116224 +0000 UTC m=+145.888683027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.102207 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.102437 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.602406211 +0000 UTC m=+145.989926938 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.102481 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.102842 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.602831612 +0000 UTC m=+145.990352419 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.182114 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" event={"ID":"c19b6ad2-27ad-4ad0-91ed-d44eb02011ac","Type":"ContainerStarted","Data":"14d252c39b8cbb359af2a39337acf16456e474c6175509a23376702f68241511"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.183989 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" event={"ID":"1f113f14-eee2-482f-9142-feac7fb442de","Type":"ContainerStarted","Data":"aef3b6c2456fcf59538a7069805160fa3880613bd1fcc10ada70dcf678ea93a6"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.186009 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" event={"ID":"02cf73a4-150b-4a14-9e46-6e986b38304f","Type":"ContainerStarted","Data":"e32a67c705abeabe6cae9100a4c74c628cc15f43639c96718d7a2aceae7ae075"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.186137 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.187569 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25w5k" 
event={"ID":"443df56d-afb1-4463-a11f-4de38331f234","Type":"ContainerStarted","Data":"497e623b3b2119c9d03db3105b9e01e0b4a28f715b96f4cf4c89c79d02b1fa8b"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.189473 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" event={"ID":"671b18a0-95da-4c17-9ef5-4b0dc243ff4f","Type":"ContainerStarted","Data":"839ba2daae1cbf37056ebbad39496d4bb392f764bd0558f50f5070cde26d8473"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.191204 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" event={"ID":"666fd1d7-2a05-4ed2-814e-4fe4c30f5e52","Type":"ContainerStarted","Data":"9da6a55092ba7f091d4e25803ad066104b0b660941079b326569e1391de634e9"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.192697 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tmcv7" event={"ID":"c177bc13-030a-442f-9efa-603b4620a2c8","Type":"ContainerStarted","Data":"0df2ffb898315d0d093dcce55229c92a04c970c1d648da629327054bc4160ac6"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.192833 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.194471 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" event={"ID":"a2edd4f4-e53f-41ea-a57a-5deb5e8a67a4","Type":"ContainerStarted","Data":"b626b35c9b8677927419db75a19dd3f4e9e8b8c548a3e2fe1965d41d075e9e87"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.194596 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.196171 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-q4qff" event={"ID":"9bf1c45d-ffb9-423e-bdea-7e2d209a47d1","Type":"ContainerStarted","Data":"f12f332b84a7a78648d7009fd82177c795a695995a10811f0e69d20982429553"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.198549 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" event={"ID":"b38ec4ff-67eb-467c-97d3-efbc96c8b4d7","Type":"ContainerStarted","Data":"7af14956cf1b54e08b782d5cfb27a5b73ed54fe6b05746ee4f63f4b27e47be5d"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.200371 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" event={"ID":"9b021c9d-ee1d-4b78-bb6f-08303c69ebee","Type":"ContainerStarted","Data":"ab5651d505a40b4c8b7bdeba96e957266eb42626757874fd5a71e7db05b1da3b"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.201983 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb" event={"ID":"f762a14e-29c3-4f5d-b051-60431de11a82","Type":"ContainerStarted","Data":"1ba3c5f190f606430c27c33afc7b68db900c9a30c59879452e012a06407787cc"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.203268 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" event={"ID":"1a95f13d-5349-4097-8482-88efaa760147","Type":"ContainerStarted","Data":"ab6c9e6a1fbae39cbc76d8affcfbb324a8b72754e5a96e58ecaae0c64658528b"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.203309 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 
17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.203434 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.703411797 +0000 UTC m=+146.090932524 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.203593 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.203951 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.70393618 +0000 UTC m=+146.091457007 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.205671 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" event={"ID":"ec55af94-b794-47c8-b6bd-ce686ad2f9a4","Type":"ContainerStarted","Data":"3180158ff9ae50784085b4e81bc7b4dd8eb576e69e5713b69e89b996bd7b2486"} Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.209092 4841 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p6dg7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.209098 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.209133 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" podUID="0648214d-f0e4-4a8d-8f0d-2c3751c8e369" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.209161 4841 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.209575 4841 patch_prober.go:28] interesting pod/console-operator-58897d9998-kq6kf container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.209619 4841 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-v9zpn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" start-of-body= Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.209648 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" podUID="734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.14:6443/healthz\": dial tcp 10.217.0.14:6443: connect: connection refused" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.209615 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-kq6kf" podUID="a25fb107-dace-4acc-8cac-e73b2788db2f" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.211013 4841 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-44m78 container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure 
output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body= Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.211041 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" podUID="292c8d77-02e4-4a9a-8f4c-3cd1acd07907" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.219432 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.221932 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mt7kx" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.232566 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-5qpx2" podStartSLOduration=123.232544982 podStartE2EDuration="2m3.232544982s" podCreationTimestamp="2025-12-03 17:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.221338785 +0000 UTC m=+145.608859512" watchObservedRunningTime="2025-12-03 17:02:31.232544982 +0000 UTC m=+145.620065709" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.278678 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-j79s8" podStartSLOduration=122.278657733 podStartE2EDuration="2m2.278657733s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 17:02:31.277315848 +0000 UTC m=+145.664836575" watchObservedRunningTime="2025-12-03 17:02:31.278657733 +0000 UTC m=+145.666178460" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.306611 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.309666 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.809647796 +0000 UTC m=+146.197168543 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.412919 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tmcv7" podStartSLOduration=9.412887309 podStartE2EDuration="9.412887309s" podCreationTimestamp="2025-12-03 17:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.344844627 +0000 UTC m=+145.732365374" watchObservedRunningTime="2025-12-03 17:02:31.412887309 +0000 UTC m=+145.800408036" Dec 03 17:02:31 crc 
kubenswrapper[4841]: I1203 17:02:31.413187 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.413561 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:31.913547505 +0000 UTC m=+146.301068222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.478314 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-8qgdw" podStartSLOduration=122.478288243 podStartE2EDuration="2m2.478288243s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.412252312 +0000 UTC m=+145.799773049" watchObservedRunningTime="2025-12-03 17:02:31.478288243 +0000 UTC m=+145.865808980" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.514781 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.515083 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.015069514 +0000 UTC m=+146.402590241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.519657 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-67ndq" podStartSLOduration=122.519642011 podStartE2EDuration="2m2.519642011s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.518635285 +0000 UTC m=+145.906156012" watchObservedRunningTime="2025-12-03 17:02:31.519642011 +0000 UTC m=+145.907162738" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.520607 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" podStartSLOduration=122.520601186 podStartE2EDuration="2m2.520601186s" podCreationTimestamp="2025-12-03 
17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.488059923 +0000 UTC m=+145.875580670" watchObservedRunningTime="2025-12-03 17:02:31.520601186 +0000 UTC m=+145.908121913" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.607707 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-rx2p2" podStartSLOduration=122.607688215 podStartE2EDuration="2m2.607688215s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.605267683 +0000 UTC m=+145.992788410" watchObservedRunningTime="2025-12-03 17:02:31.607688215 +0000 UTC m=+145.995208942" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.616699 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.617117 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.117103796 +0000 UTC m=+146.504624523 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.693641 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" podStartSLOduration=122.693624235 podStartE2EDuration="2m2.693624235s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.651978559 +0000 UTC m=+146.039499286" watchObservedRunningTime="2025-12-03 17:02:31.693624235 +0000 UTC m=+146.081144962" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.720227 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.720978 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.220957334 +0000 UTC m=+146.608478061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.729962 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h" podStartSLOduration=122.729945754 podStartE2EDuration="2m2.729945754s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.695147234 +0000 UTC m=+146.082667961" watchObservedRunningTime="2025-12-03 17:02:31.729945754 +0000 UTC m=+146.117466481" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.764355 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:31 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:31 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:31 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.764423 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.769057 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-k8hdg" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.782354 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-2jhj8" podStartSLOduration=122.782332015 podStartE2EDuration="2m2.782332015s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.732586932 +0000 UTC m=+146.120107669" watchObservedRunningTime="2025-12-03 17:02:31.782332015 +0000 UTC m=+146.169852742" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.821841 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.822226 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.322213866 +0000 UTC m=+146.709734593 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.838700 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-7dnfb" podStartSLOduration=122.838680878 podStartE2EDuration="2m2.838680878s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:31.785018174 +0000 UTC m=+146.172538901" watchObservedRunningTime="2025-12-03 17:02:31.838680878 +0000 UTC m=+146.226201605" Dec 03 17:02:31 crc kubenswrapper[4841]: I1203 17:02:31.922438 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:31 crc kubenswrapper[4841]: E1203 17:02:31.922891 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.422873123 +0000 UTC m=+146.810393850 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.023782 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.024076 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.524065293 +0000 UTC m=+146.911586020 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.125078 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.125272 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.625242203 +0000 UTC m=+147.012762940 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.125331 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.125656 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.625640123 +0000 UTC m=+147.013160850 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.143365 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dshxz"] Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.144558 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.193495 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.218231 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dshxz"] Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.227322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.227762 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-utilities\") pod \"certified-operators-dshxz\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 
crc kubenswrapper[4841]: I1203 17:02:32.227881 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-catalog-content\") pod \"certified-operators-dshxz\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.227937 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn8zd\" (UniqueName: \"kubernetes.io/projected/68909a3d-4731-4851-a511-0b66e05d3741-kube-api-access-vn8zd\") pod \"certified-operators-dshxz\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.228186 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.728158347 +0000 UTC m=+147.115679074 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.319839 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" event={"ID":"671b18a0-95da-4c17-9ef5-4b0dc243ff4f","Type":"ContainerStarted","Data":"417916622f12e2a6825ec93d41947da40b9f86a0a904e7350821fa1cb3708ff2"} Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.329734 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-nr854"] Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.331152 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-catalog-content\") pod \"certified-operators-dshxz\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.331195 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.331219 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn8zd\" (UniqueName: 
\"kubernetes.io/projected/68909a3d-4731-4851-a511-0b66e05d3741-kube-api-access-vn8zd\") pod \"certified-operators-dshxz\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.331298 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-utilities\") pod \"certified-operators-dshxz\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.331710 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-utilities\") pod \"certified-operators-dshxz\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.331935 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-catalog-content\") pod \"certified-operators-dshxz\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.332206 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.83219327 +0000 UTC m=+147.219713997 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.343405 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25w5k" event={"ID":"443df56d-afb1-4463-a11f-4de38331f234","Type":"ContainerStarted","Data":"30c0614e66b222bf80b488c543e4ffdd209fce824130dd79bb37912b116f54cc"} Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.344043 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.350691 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.351175 4841 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-p6dg7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" start-of-body= Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.351216 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" podUID="0648214d-f0e4-4a8d-8f0d-2c3751c8e369" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.42:8080/healthz\": dial tcp 10.217.0.42:8080: connect: connection refused" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.360675 4841 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-44m78" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.362971 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nr854"] Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.389895 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn8zd\" (UniqueName: \"kubernetes.io/projected/68909a3d-4731-4851-a511-0b66e05d3741-kube-api-access-vn8zd\") pod \"certified-operators-dshxz\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.433364 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.433982 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-utilities\") pod \"community-operators-nr854\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.434055 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flmb5\" (UniqueName: \"kubernetes.io/projected/73b4070c-62dc-49b6-b2fe-8ae468318da3-kube-api-access-flmb5\") pod \"community-operators-nr854\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.434082 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-catalog-content\") pod \"community-operators-nr854\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.434378 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:32.934357405 +0000 UTC m=+147.321878132 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.473172 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" podStartSLOduration=124.473150888 podStartE2EDuration="2m4.473150888s" podCreationTimestamp="2025-12-03 17:00:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:32.417324729 +0000 UTC m=+146.804845466" watchObservedRunningTime="2025-12-03 17:02:32.473150888 +0000 UTC m=+146.860671615" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.476354 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tcknq"] Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 
17:02:32.477504 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.491016 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.547451 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-catalog-content\") pod \"certified-operators-tcknq\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.547506 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.547564 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-utilities\") pod \"community-operators-nr854\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.547584 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flmb5\" (UniqueName: \"kubernetes.io/projected/73b4070c-62dc-49b6-b2fe-8ae468318da3-kube-api-access-flmb5\") pod \"community-operators-nr854\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " 
pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.547600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-catalog-content\") pod \"community-operators-nr854\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.547655 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-utilities\") pod \"certified-operators-tcknq\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.547685 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8db4g\" (UniqueName: \"kubernetes.io/projected/9585f22b-d1dd-499c-a9e8-37c212f22844-kube-api-access-8db4g\") pod \"certified-operators-tcknq\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.548042 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:33.048030315 +0000 UTC m=+147.435551042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.549126 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tcknq"] Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.549177 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-catalog-content\") pod \"community-operators-nr854\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.549384 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-utilities\") pod \"community-operators-nr854\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.608829 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flmb5\" (UniqueName: \"kubernetes.io/projected/73b4070c-62dc-49b6-b2fe-8ae468318da3-kube-api-access-flmb5\") pod \"community-operators-nr854\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.649997 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8cl5k"] Dec 03 17:02:32 crc 
kubenswrapper[4841]: I1203 17:02:32.651106 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.658450 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.658751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-catalog-content\") pod \"certified-operators-tcknq\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.658853 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-utilities\") pod \"certified-operators-tcknq\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.658897 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8db4g\" (UniqueName: \"kubernetes.io/projected/9585f22b-d1dd-499c-a9e8-37c212f22844-kube-api-access-8db4g\") pod \"certified-operators-tcknq\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.659309 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:33.159290723 +0000 UTC m=+147.546811450 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.659734 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-catalog-content\") pod \"certified-operators-tcknq\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.660056 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-utilities\") pod \"certified-operators-tcknq\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.675559 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cl5k"] Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.687357 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-nr854" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.721802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8db4g\" (UniqueName: \"kubernetes.io/projected/9585f22b-d1dd-499c-a9e8-37c212f22844-kube-api-access-8db4g\") pod \"certified-operators-tcknq\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.760550 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.760667 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4fpg\" (UniqueName: \"kubernetes.io/projected/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-kube-api-access-l4fpg\") pod \"community-operators-8cl5k\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.760694 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-utilities\") pod \"community-operators-8cl5k\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.760711 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-catalog-content\") pod \"community-operators-8cl5k\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.761024 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:33.261013377 +0000 UTC m=+147.648534104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.768113 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:32 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:32 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:32 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.768180 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.804221 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.874507 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.874869 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-utilities\") pod \"community-operators-8cl5k\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.874889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-catalog-content\") pod \"community-operators-8cl5k\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.875024 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4fpg\" (UniqueName: \"kubernetes.io/projected/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-kube-api-access-l4fpg\") pod \"community-operators-8cl5k\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.875334 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 17:02:33.375319923 +0000 UTC m=+147.762840650 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.875664 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-utilities\") pod \"community-operators-8cl5k\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.875886 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-catalog-content\") pod \"community-operators-8cl5k\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.926824 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4fpg\" (UniqueName: \"kubernetes.io/projected/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-kube-api-access-l4fpg\") pod \"community-operators-8cl5k\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:32 crc kubenswrapper[4841]: I1203 17:02:32.981844 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:32 crc kubenswrapper[4841]: E1203 17:02:32.982230 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:33.482217609 +0000 UTC m=+147.869738336 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.006473 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.018263 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.087307 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.087947 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:33.587932365 +0000 UTC m=+147.975453092 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.193716 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.194178 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:33.694162664 +0000 UTC m=+148.081683391 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.294423 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.294659 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:33.794630806 +0000 UTC m=+148.182151533 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.294855 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.295187 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:33.79517594 +0000 UTC m=+148.182696667 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.390490 4841 generic.go:334] "Generic (PLEG): container finished" podID="ad231084-6053-40b1-892c-284992b5df93" containerID="61f081450db7f42930a6fbe061d97e9fc539af5dbd2e07cfd7b35b62fb9c257d" exitCode=0 Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.390574 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" event={"ID":"ad231084-6053-40b1-892c-284992b5df93","Type":"ContainerDied","Data":"61f081450db7f42930a6fbe061d97e9fc539af5dbd2e07cfd7b35b62fb9c257d"} Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.399431 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.399816 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:33.899793068 +0000 UTC m=+148.287313795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.425953 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25w5k" event={"ID":"443df56d-afb1-4463-a11f-4de38331f234","Type":"ContainerStarted","Data":"6aa733d05a03f53aaa9b90766fb7bfa490954754b7e12dbc901793b805e56f1b"}
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.426680 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dshxz"]
Dec 03 17:02:33 crc kubenswrapper[4841]: W1203 17:02:33.470789 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68909a3d_4731_4851_a511_0b66e05d3741.slice/crio-4f9c8101b7c5131f276091ce9a1e61182e5553d6de7de744d67117b8a6024718 WatchSource:0}: Error finding container 4f9c8101b7c5131f276091ce9a1e61182e5553d6de7de744d67117b8a6024718: Status 404 returned error can't find the container with id 4f9c8101b7c5131f276091ce9a1e61182e5553d6de7de744d67117b8a6024718
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.507580 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.509824 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.009812513 +0000 UTC m=+148.397333230 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.598106 4841 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.621679 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.622448 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.122432436 +0000 UTC m=+148.509953163 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.725769 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.726171 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.226155301 +0000 UTC m=+148.613676028 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.772197 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 03 17:02:33 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld
Dec 03 17:02:33 crc kubenswrapper[4841]: [+]process-running ok
Dec 03 17:02:33 crc kubenswrapper[4841]: healthz check failed
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.772246 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.827382 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.827696 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.327681199 +0000 UTC m=+148.715201926 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.869180 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-nr854"]
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.928608 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:33 crc kubenswrapper[4841]: E1203 17:02:33.929087 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.429071595 +0000 UTC m=+148.816592322 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:33 crc kubenswrapper[4841]: I1203 17:02:33.930554 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tcknq"]
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.029492 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:34 crc kubenswrapper[4841]: E1203 17:02:34.030175 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.530155312 +0000 UTC m=+148.917676039 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.033598 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4nlrj"]
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.047180 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.051466 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.063440 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nlrj"]
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.133521 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-catalog-content\") pod \"redhat-marketplace-4nlrj\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.133561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.133609 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk5fp\" (UniqueName: \"kubernetes.io/projected/f9594e34-3c26-4c76-b090-c9d8218398a6-kube-api-access-bk5fp\") pod \"redhat-marketplace-4nlrj\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.133649 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-utilities\") pod \"redhat-marketplace-4nlrj\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: E1203 17:02:34.133967 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.633955519 +0000 UTC m=+149.021476246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.142491 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8cl5k"]
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.234877 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.235040 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-utilities\") pod \"redhat-marketplace-4nlrj\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.235110 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-catalog-content\") pod \"redhat-marketplace-4nlrj\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.235157 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk5fp\" (UniqueName: \"kubernetes.io/projected/f9594e34-3c26-4c76-b090-c9d8218398a6-kube-api-access-bk5fp\") pod \"redhat-marketplace-4nlrj\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: E1203 17:02:34.235465 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.735445637 +0000 UTC m=+149.122966364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.235780 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-utilities\") pod \"redhat-marketplace-4nlrj\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.236049 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-catalog-content\") pod \"redhat-marketplace-4nlrj\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.286059 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk5fp\" (UniqueName: \"kubernetes.io/projected/f9594e34-3c26-4c76-b090-c9d8218398a6-kube-api-access-bk5fp\") pod \"redhat-marketplace-4nlrj\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.336575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.336630 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.336659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.336689 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.336723 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 17:02:34 crc kubenswrapper[4841]: E1203 17:02:34.337506 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.837488929 +0000 UTC m=+149.225009656 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-mtwkx" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.340698 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.340803 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.342481 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.343110 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.355172 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.364245 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.409852 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nlrj"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.418773 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.419038 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.440805 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:34 crc kubenswrapper[4841]: E1203 17:02:34.441319 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 17:02:34.941299356 +0000 UTC m=+149.328820103 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.445182 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b9qwk"]
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.446693 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.463240 4841 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T17:02:33.598135364Z","Handler":null,"Name":""}
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.475034 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9qwk"]
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.496333 4841 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.496370 4841 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.499045 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-25w5k" event={"ID":"443df56d-afb1-4463-a11f-4de38331f234","Type":"ContainerStarted","Data":"67f243e93a4e8af74227c519d3e9ccbf7f6e1d628c77762ea32b8fdc98db68d7"}
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.504068 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr854" event={"ID":"73b4070c-62dc-49b6-b2fe-8ae468318da3","Type":"ContainerStarted","Data":"26a82e702d2a3c83881d3fff2c701595576f7485a8cfe56dd0ebf32c96699ccd"}
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.504123 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr854" event={"ID":"73b4070c-62dc-49b6-b2fe-8ae468318da3","Type":"ContainerStarted","Data":"ab7ab9c7da70f9ede7e5edcc7f3f37d5cf9f534fb1326e4ef72235f8ef6a3a95"}
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.509544 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.509579 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.512970 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.542075 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-catalog-content\") pod \"redhat-marketplace-b9qwk\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.542114 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-utilities\") pod \"redhat-marketplace-b9qwk\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.542150 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkcv4\" (UniqueName: \"kubernetes.io/projected/0e9e3466-59a6-466a-bb38-936fc4be6f9a-kube-api-access-tkcv4\") pod \"redhat-marketplace-b9qwk\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.542171 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.555049 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dshxz" event={"ID":"68909a3d-4731-4851-a511-0b66e05d3741","Type":"ContainerDied","Data":"cb59915c8d4605ce62f632d28f71719c3731f6b76dd27d2578a4b8aa137fa82a"}
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.555284 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.555766 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-25w5k" podStartSLOduration=12.555751846 podStartE2EDuration="12.555751846s" podCreationTimestamp="2025-12-03 17:02:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:34.550049 +0000 UTC m=+148.937569727" watchObservedRunningTime="2025-12-03 17:02:34.555751846 +0000 UTC m=+148.943272573"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.557122 4841 generic.go:334] "Generic (PLEG): container finished" podID="68909a3d-4731-4851-a511-0b66e05d3741" containerID="cb59915c8d4605ce62f632d28f71719c3731f6b76dd27d2578a4b8aa137fa82a" exitCode=0
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.557244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dshxz" event={"ID":"68909a3d-4731-4851-a511-0b66e05d3741","Type":"ContainerStarted","Data":"4f9c8101b7c5131f276091ce9a1e61182e5553d6de7de744d67117b8a6024718"}
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.564517 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.590147 4841 generic.go:334] "Generic (PLEG): container finished" podID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerID="856a0dd0224f5481438d6bf0a3963be4eeef6f7754a16550d58f2565c5653b58" exitCode=0
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.590236 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcknq" event={"ID":"9585f22b-d1dd-499c-a9e8-37c212f22844","Type":"ContainerDied","Data":"856a0dd0224f5481438d6bf0a3963be4eeef6f7754a16550d58f2565c5653b58"}
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.590265 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcknq" event={"ID":"9585f22b-d1dd-499c-a9e8-37c212f22844","Type":"ContainerStarted","Data":"f3263a888903dc65cf033b7dc1c0d9a814ef4d82b0991ecd8b3c74ceb1334560"}
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.612429 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cl5k" event={"ID":"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16","Type":"ContainerStarted","Data":"d8cd27a302b79fcac2a25b585b39a57836f256837e14aa552274b9f210162774"}
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.622221 4841 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.622284 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.643766 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-catalog-content\") pod \"redhat-marketplace-b9qwk\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.643805 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-utilities\") pod \"redhat-marketplace-b9qwk\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.643835 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkcv4\" (UniqueName: \"kubernetes.io/projected/0e9e3466-59a6-466a-bb38-936fc4be6f9a-kube-api-access-tkcv4\") pod \"redhat-marketplace-b9qwk\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.645506 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-catalog-content\") pod \"redhat-marketplace-b9qwk\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.648116 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-utilities\") pod \"redhat-marketplace-b9qwk\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.694814 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-mtwkx\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.695351 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkcv4\" (UniqueName: \"kubernetes.io/projected/0e9e3466-59a6-466a-bb38-936fc4be6f9a-kube-api-access-tkcv4\") pod \"redhat-marketplace-b9qwk\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.745766 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.756528 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Dec 03 17:02:34 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld
Dec 03 17:02:34 crc kubenswrapper[4841]: [+]process-running ok
Dec 03 17:02:34 crc kubenswrapper[4841]: healthz check failed
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.756585 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.764980 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.849858 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9qwk"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.911897 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt"
Dec 03 17:02:34 crc kubenswrapper[4841]: I1203 17:02:34.939224 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx"
Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.058202 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l"
Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.157154 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad231084-6053-40b1-892c-284992b5df93-secret-volume\") pod \"ad231084-6053-40b1-892c-284992b5df93\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") "
Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.157464 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad231084-6053-40b1-892c-284992b5df93-config-volume\") pod \"ad231084-6053-40b1-892c-284992b5df93\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") "
Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.157513 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z658g\" (UniqueName: \"kubernetes.io/projected/ad231084-6053-40b1-892c-284992b5df93-kube-api-access-z658g\") pod \"ad231084-6053-40b1-892c-284992b5df93\" (UID: \"ad231084-6053-40b1-892c-284992b5df93\") "
Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.158486 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad231084-6053-40b1-892c-284992b5df93-config-volume" (OuterVolumeSpecName: "config-volume") pod "ad231084-6053-40b1-892c-284992b5df93" (UID: "ad231084-6053-40b1-892c-284992b5df93"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.160753 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad231084-6053-40b1-892c-284992b5df93-kube-api-access-z658g" (OuterVolumeSpecName: "kube-api-access-z658g") pod "ad231084-6053-40b1-892c-284992b5df93" (UID: "ad231084-6053-40b1-892c-284992b5df93").
InnerVolumeSpecName "kube-api-access-z658g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.162174 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad231084-6053-40b1-892c-284992b5df93-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ad231084-6053-40b1-892c-284992b5df93" (UID: "ad231084-6053-40b1-892c-284992b5df93"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.235881 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nlrj"] Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.258036 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9qwk"] Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.258468 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ad231084-6053-40b1-892c-284992b5df93-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.258491 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ad231084-6053-40b1-892c-284992b5df93-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.258505 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z658g\" (UniqueName: \"kubernetes.io/projected/ad231084-6053-40b1-892c-284992b5df93-kube-api-access-z658g\") on node \"crc\" DevicePath \"\"" Dec 03 17:02:35 crc kubenswrapper[4841]: W1203 17:02:35.282831 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-f6853b3b0e6b6ccb9e728421942e171dda370226753681628c3a91d305bfbb32 
WatchSource:0}: Error finding container f6853b3b0e6b6ccb9e728421942e171dda370226753681628c3a91d305bfbb32: Status 404 returned error can't find the container with id f6853b3b0e6b6ccb9e728421942e171dda370226753681628c3a91d305bfbb32 Dec 03 17:02:35 crc kubenswrapper[4841]: W1203 17:02:35.287560 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e9e3466_59a6_466a_bb38_936fc4be6f9a.slice/crio-e8e133d78de52376345e5c84ecf67325826130e6a1ce7ea41ce2431b7edc195b WatchSource:0}: Error finding container e8e133d78de52376345e5c84ecf67325826130e6a1ce7ea41ce2431b7edc195b: Status 404 returned error can't find the container with id e8e133d78de52376345e5c84ecf67325826130e6a1ce7ea41ce2431b7edc195b Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.341378 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mtwkx"] Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.419602 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-glndx"] Dec 03 17:02:35 crc kubenswrapper[4841]: E1203 17:02:35.419840 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad231084-6053-40b1-892c-284992b5df93" containerName="collect-profiles" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.419860 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad231084-6053-40b1-892c-284992b5df93" containerName="collect-profiles" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.420159 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad231084-6053-40b1-892c-284992b5df93" containerName="collect-profiles" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.421160 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.434075 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.436802 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-glndx"] Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.460603 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q48cg\" (UniqueName: \"kubernetes.io/projected/958ead90-9bd3-4c1b-b9e5-21378ecff345-kube-api-access-q48cg\") pod \"redhat-operators-glndx\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.460674 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-catalog-content\") pod \"redhat-operators-glndx\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.460735 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-utilities\") pod \"redhat-operators-glndx\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.522020 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.525818 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.530956 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.531342 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.535136 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.562411 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q48cg\" (UniqueName: \"kubernetes.io/projected/958ead90-9bd3-4c1b-b9e5-21378ecff345-kube-api-access-q48cg\") pod \"redhat-operators-glndx\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.562465 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-catalog-content\") pod \"redhat-operators-glndx\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.562516 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.562552 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-utilities\") pod \"redhat-operators-glndx\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.562582 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.563328 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-catalog-content\") pod \"redhat-operators-glndx\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.563595 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-utilities\") pod \"redhat-operators-glndx\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.581834 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q48cg\" (UniqueName: \"kubernetes.io/projected/958ead90-9bd3-4c1b-b9e5-21378ecff345-kube-api-access-q48cg\") pod \"redhat-operators-glndx\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.617266 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nlrj" 
event={"ID":"f9594e34-3c26-4c76-b090-c9d8218398a6","Type":"ContainerStarted","Data":"feec0fa44f50fc643834a5ecd1dbcbaae36488a498f27bfe8e9687258e53cc2a"} Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.618811 4841 generic.go:334] "Generic (PLEG): container finished" podID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerID="26a82e702d2a3c83881d3fff2c701595576f7485a8cfe56dd0ebf32c96699ccd" exitCode=0 Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.618884 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr854" event={"ID":"73b4070c-62dc-49b6-b2fe-8ae468318da3","Type":"ContainerDied","Data":"26a82e702d2a3c83881d3fff2c701595576f7485a8cfe56dd0ebf32c96699ccd"} Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.619936 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cf178d473e6cacfbb85c4bb9401be9956659794d5944593a1543507d35332bdb"} Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.621167 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" event={"ID":"ad231084-6053-40b1-892c-284992b5df93","Type":"ContainerDied","Data":"98b5693d485a03c868f478fd59d7b96cf5d3cabdd3dce64d021cadda92a60390"} Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.621189 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98b5693d485a03c868f478fd59d7b96cf5d3cabdd3dce64d021cadda92a60390" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.621379 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.622527 4841 generic.go:334] "Generic (PLEG): container finished" podID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerID="9bbaeae09b72aaa8865298cfa1ef7ddbd14e9d52f56a66afff3f10eef2e93e89" exitCode=0 Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.622578 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cl5k" event={"ID":"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16","Type":"ContainerDied","Data":"9bbaeae09b72aaa8865298cfa1ef7ddbd14e9d52f56a66afff3f10eef2e93e89"} Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.625668 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" event={"ID":"3c61a910-e9a4-4f77-a5d4-56e760ed1394","Type":"ContainerStarted","Data":"c5c2ffe93b02ec22e76079ca0477828d217d2b82e6b2c8539aa561ac532308a4"} Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.631858 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f6853b3b0e6b6ccb9e728421942e171dda370226753681628c3a91d305bfbb32"} Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.636458 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"64640212b4733085ab2da9f9d4152c2ea7eae8b95bf53d852e97d75ba69a4b00"} Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.637577 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9qwk" 
event={"ID":"0e9e3466-59a6-466a-bb38-936fc4be6f9a","Type":"ContainerStarted","Data":"e8e133d78de52376345e5c84ecf67325826130e6a1ce7ea41ce2431b7edc195b"} Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.645132 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-q28gm" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.665043 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.665116 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.665478 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.691724 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.754471 4841 
patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:35 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:35 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:35 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.754806 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.767636 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.772686 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.823790 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8pdm9"] Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.825439 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.834503 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8pdm9"] Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.848984 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.868322 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-catalog-content\") pod \"redhat-operators-8pdm9\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.868380 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s99j\" (UniqueName: \"kubernetes.io/projected/09e5fe7e-183b-431c-a431-e58aeba0e1aa-kube-api-access-7s99j\") pod \"redhat-operators-8pdm9\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.868426 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-utilities\") pod \"redhat-operators-8pdm9\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.874963 4841 patch_prober.go:28] interesting pod/apiserver-76f77b778f-7v9z5 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Dec 03 17:02:35 crc kubenswrapper[4841]: [+]log ok Dec 03 17:02:35 crc kubenswrapper[4841]: [+]etcd ok Dec 03 17:02:35 crc kubenswrapper[4841]: [+]poststarthook/start-apiserver-admission-initializer ok Dec 03 17:02:35 crc kubenswrapper[4841]: [+]poststarthook/generic-apiserver-start-informers ok Dec 03 17:02:35 crc kubenswrapper[4841]: [+]poststarthook/max-in-flight-filter ok Dec 03 
17:02:35 crc kubenswrapper[4841]: [+]poststarthook/storage-object-count-tracker-hook ok Dec 03 17:02:35 crc kubenswrapper[4841]: [+]poststarthook/image.openshift.io-apiserver-caches ok Dec 03 17:02:35 crc kubenswrapper[4841]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Dec 03 17:02:35 crc kubenswrapper[4841]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Dec 03 17:02:35 crc kubenswrapper[4841]: [+]poststarthook/project.openshift.io-projectcache ok Dec 03 17:02:35 crc kubenswrapper[4841]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Dec 03 17:02:35 crc kubenswrapper[4841]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Dec 03 17:02:35 crc kubenswrapper[4841]: [+]poststarthook/openshift.io-restmapperupdater ok Dec 03 17:02:35 crc kubenswrapper[4841]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Dec 03 17:02:35 crc kubenswrapper[4841]: livez check failed Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.875016 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" podUID="671b18a0-95da-4c17-9ef5-4b0dc243ff4f" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.971151 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-catalog-content\") pod \"redhat-operators-8pdm9\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.971752 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s99j\" (UniqueName: \"kubernetes.io/projected/09e5fe7e-183b-431c-a431-e58aeba0e1aa-kube-api-access-7s99j\") 
pod \"redhat-operators-8pdm9\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.971814 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-utilities\") pod \"redhat-operators-8pdm9\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.972793 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-utilities\") pod \"redhat-operators-8pdm9\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.972897 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-catalog-content\") pod \"redhat-operators-8pdm9\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:35 crc kubenswrapper[4841]: I1203 17:02:35.991553 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s99j\" (UniqueName: \"kubernetes.io/projected/09e5fe7e-183b-431c-a431-e58aeba0e1aa-kube-api-access-7s99j\") pod \"redhat-operators-8pdm9\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.033301 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-glndx"] Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.100331 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 17:02:36 crc kubenswrapper[4841]: W1203 17:02:36.111450 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5220c2fa_0a5f_4b19_b2bc_a8f91e0c3885.slice/crio-08ac1fa6e3906096c15def921f5128ff535fcf304a04c44e247035aa7a64f755 WatchSource:0}: Error finding container 08ac1fa6e3906096c15def921f5128ff535fcf304a04c44e247035aa7a64f755: Status 404 returned error can't find the container with id 08ac1fa6e3906096c15def921f5128ff535fcf304a04c44e247035aa7a64f755 Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.139755 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.253171 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.340492 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8pdm9"] Dec 03 17:02:36 crc kubenswrapper[4841]: W1203 17:02:36.352488 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09e5fe7e_183b_431c_a431_e58aeba0e1aa.slice/crio-8528be7e01068c63587f4a63fa0ed1c726f36b4d4c02353ecb696ddc841b6356 WatchSource:0}: Error finding container 8528be7e01068c63587f4a63fa0ed1c726f36b4d4c02353ecb696ddc841b6356: Status 404 returned error can't find the container with id 8528be7e01068c63587f4a63fa0ed1c726f36b4d4c02353ecb696ddc841b6356 Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.670154 4841 generic.go:334] "Generic (PLEG): container finished" podID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerID="d2856a91b69851f2dc5f5bc03479bbe5c6d5e779d6e045fcba55070598fd378a" exitCode=0 Dec 03 17:02:36 crc 
kubenswrapper[4841]: I1203 17:02:36.670265 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9qwk" event={"ID":"0e9e3466-59a6-466a-bb38-936fc4be6f9a","Type":"ContainerDied","Data":"d2856a91b69851f2dc5f5bc03479bbe5c6d5e779d6e045fcba55070598fd378a"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.675738 4841 generic.go:334] "Generic (PLEG): container finished" podID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerID="572ea3bb9967c3b301eb5b11b7d08ee70c33cd0f5b5aeb5929e73aa78550c1b0" exitCode=0 Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.675823 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pdm9" event={"ID":"09e5fe7e-183b-431c-a431-e58aeba0e1aa","Type":"ContainerDied","Data":"572ea3bb9967c3b301eb5b11b7d08ee70c33cd0f5b5aeb5929e73aa78550c1b0"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.675856 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pdm9" event={"ID":"09e5fe7e-183b-431c-a431-e58aeba0e1aa","Type":"ContainerStarted","Data":"8528be7e01068c63587f4a63fa0ed1c726f36b4d4c02353ecb696ddc841b6356"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.681280 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7707e953115e3525a3cde1f3867c089cea958b25d725b9bf25c80a42389851ad"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.684090 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a6820742496a8355ec838e4f15364430d2e33574dfc4802d17b5bcb8a8b6fb90"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.684354 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.687544 4841 generic.go:334] "Generic (PLEG): container finished" podID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerID="95bf333d16833b5f03b97023a7cea0639395386b974f669ae232f22bafff3153" exitCode=0 Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.687632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glndx" event={"ID":"958ead90-9bd3-4c1b-b9e5-21378ecff345","Type":"ContainerDied","Data":"95bf333d16833b5f03b97023a7cea0639395386b974f669ae232f22bafff3153"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.687681 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glndx" event={"ID":"958ead90-9bd3-4c1b-b9e5-21378ecff345","Type":"ContainerStarted","Data":"69bdebac4d4a7425247e1c67b2ebe6f9be56e9591fd8bb9ac1e657579d4a7a63"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.690087 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885","Type":"ContainerStarted","Data":"0a69723942d074413879993093129fc0036732b86ae1a1c0af0d9958b75e8c35"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.690160 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885","Type":"ContainerStarted","Data":"08ac1fa6e3906096c15def921f5128ff535fcf304a04c44e247035aa7a64f755"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.692079 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" event={"ID":"3c61a910-e9a4-4f77-a5d4-56e760ed1394","Type":"ContainerStarted","Data":"6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10"} Dec 03 17:02:36 crc 
kubenswrapper[4841]: I1203 17:02:36.692241 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.696708 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"182b61fb101f17c6844fca69f04e4b73e0c8d7de2e253ce254c4a41325c635e1"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.703583 4841 generic.go:334] "Generic (PLEG): container finished" podID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerID="98b093b496796467772e69c97d01c03e7307d386f8e7956241f9a1bc80ae68f6" exitCode=0 Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.703662 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nlrj" event={"ID":"f9594e34-3c26-4c76-b090-c9d8218398a6","Type":"ContainerDied","Data":"98b093b496796467772e69c97d01c03e7307d386f8e7956241f9a1bc80ae68f6"} Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.755109 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.758776 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:36 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:36 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:36 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.758823 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" 
podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.778971 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.779024 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.782398 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.782449 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.855846 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" podStartSLOduration=127.855830862 podStartE2EDuration="2m7.855830862s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
17:02:36.855509793 +0000 UTC m=+151.243030520" watchObservedRunningTime="2025-12-03 17:02:36.855830862 +0000 UTC m=+151.243351589" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.885389 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.8853656779999999 podStartE2EDuration="1.885365678s" podCreationTimestamp="2025-12-03 17:02:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:36.880036221 +0000 UTC m=+151.267556948" watchObservedRunningTime="2025-12-03 17:02:36.885365678 +0000 UTC m=+151.272886415" Dec 03 17:02:36 crc kubenswrapper[4841]: I1203 17:02:36.921670 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:02:37 crc kubenswrapper[4841]: I1203 17:02:37.015282 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-kq6kf" Dec 03 17:02:37 crc kubenswrapper[4841]: I1203 17:02:37.018593 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:37 crc kubenswrapper[4841]: I1203 17:02:37.018689 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:02:37 crc kubenswrapper[4841]: I1203 17:02:37.020222 4841 patch_prober.go:28] interesting pod/console-f9d7485db-ngr75 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Dec 03 17:02:37 crc kubenswrapper[4841]: I1203 17:02:37.020263 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ngr75" 
podUID="34e90356-ed2e-4e60-9e00-97a1b62d640b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Dec 03 17:02:37 crc kubenswrapper[4841]: I1203 17:02:37.722412 4841 generic.go:334] "Generic (PLEG): container finished" podID="5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885" containerID="0a69723942d074413879993093129fc0036732b86ae1a1c0af0d9958b75e8c35" exitCode=0 Dec 03 17:02:37 crc kubenswrapper[4841]: I1203 17:02:37.722516 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885","Type":"ContainerDied","Data":"0a69723942d074413879993093129fc0036732b86ae1a1c0af0d9958b75e8c35"} Dec 03 17:02:37 crc kubenswrapper[4841]: I1203 17:02:37.754521 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:37 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:37 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:37 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:37 crc kubenswrapper[4841]: I1203 17:02:37.754585 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.145542 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.146801 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.150833 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.151195 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.155047 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.216831 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.217100 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.318436 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.318544 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.319028 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.337477 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.474877 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.755014 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:38 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:38 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:38 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.755354 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:38 crc kubenswrapper[4841]: I1203 17:02:38.996167 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.032010 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.128994 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kubelet-dir\") pod \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\" (UID: \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\") " Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.129148 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kube-api-access\") pod \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\" (UID: \"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885\") " Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.130357 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885" (UID: "5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.134648 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885" (UID: "5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.232169 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.232208 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.317691 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.317950 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.421177 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.428832 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7v9z5" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.756256 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:39 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:39 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:39 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.756307 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.756528 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.756511 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885","Type":"ContainerDied","Data":"08ac1fa6e3906096c15def921f5128ff535fcf304a04c44e247035aa7a64f755"} Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.756631 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="08ac1fa6e3906096c15def921f5128ff535fcf304a04c44e247035aa7a64f755" Dec 03 17:02:39 crc kubenswrapper[4841]: I1203 17:02:39.760467 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78567fd3-5f88-4b0c-b000-8bce7650e6a7","Type":"ContainerStarted","Data":"ab1751aa043747690ac75da01e01355c570144f4f4d97ac666df2982b56e1c33"} Dec 03 17:02:40 crc kubenswrapper[4841]: I1203 17:02:40.755369 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:40 crc 
kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:40 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:40 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:40 crc kubenswrapper[4841]: I1203 17:02:40.755680 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:40 crc kubenswrapper[4841]: I1203 17:02:40.786620 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78567fd3-5f88-4b0c-b000-8bce7650e6a7","Type":"ContainerStarted","Data":"5c41b4aeda513f00c5f815b1ccc4906e18cca01533a610895e41517013782fe5"} Dec 03 17:02:40 crc kubenswrapper[4841]: I1203 17:02:40.804351 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.80433647 podStartE2EDuration="2.80433647s" podCreationTimestamp="2025-12-03 17:02:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:02:40.801090417 +0000 UTC m=+155.188611144" watchObservedRunningTime="2025-12-03 17:02:40.80433647 +0000 UTC m=+155.191857197" Dec 03 17:02:41 crc kubenswrapper[4841]: I1203 17:02:41.762390 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:41 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:41 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:41 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:41 crc kubenswrapper[4841]: I1203 17:02:41.762766 4841 prober.go:107] "Probe 
failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:41 crc kubenswrapper[4841]: I1203 17:02:41.928085 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tmcv7" Dec 03 17:02:42 crc kubenswrapper[4841]: I1203 17:02:42.042761 4841 generic.go:334] "Generic (PLEG): container finished" podID="78567fd3-5f88-4b0c-b000-8bce7650e6a7" containerID="5c41b4aeda513f00c5f815b1ccc4906e18cca01533a610895e41517013782fe5" exitCode=0 Dec 03 17:02:42 crc kubenswrapper[4841]: I1203 17:02:42.042829 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78567fd3-5f88-4b0c-b000-8bce7650e6a7","Type":"ContainerDied","Data":"5c41b4aeda513f00c5f815b1ccc4906e18cca01533a610895e41517013782fe5"} Dec 03 17:02:42 crc kubenswrapper[4841]: I1203 17:02:42.753439 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:42 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:42 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:42 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:42 crc kubenswrapper[4841]: I1203 17:02:42.753486 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:43 crc kubenswrapper[4841]: I1203 17:02:43.755542 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:43 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:43 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:43 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:43 crc kubenswrapper[4841]: I1203 17:02:43.755924 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:44 crc kubenswrapper[4841]: I1203 17:02:44.760045 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:44 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:44 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:44 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:44 crc kubenswrapper[4841]: I1203 17:02:44.760108 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:45 crc kubenswrapper[4841]: I1203 17:02:45.753439 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:45 crc kubenswrapper[4841]: [-]has-synced failed: reason withheld Dec 03 17:02:45 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:45 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:45 
crc kubenswrapper[4841]: I1203 17:02:45.753502 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:46 crc kubenswrapper[4841]: I1203 17:02:46.754182 4841 patch_prober.go:28] interesting pod/router-default-5444994796-jghh5 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 17:02:46 crc kubenswrapper[4841]: [+]has-synced ok Dec 03 17:02:46 crc kubenswrapper[4841]: [+]process-running ok Dec 03 17:02:46 crc kubenswrapper[4841]: healthz check failed Dec 03 17:02:46 crc kubenswrapper[4841]: I1203 17:02:46.754432 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jghh5" podUID="cc498551-0214-4646-bebe-1129a989142c" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 17:02:46 crc kubenswrapper[4841]: I1203 17:02:46.778984 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:02:46 crc kubenswrapper[4841]: I1203 17:02:46.779043 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:02:46 crc kubenswrapper[4841]: I1203 17:02:46.779088 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness 
probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:02:46 crc kubenswrapper[4841]: I1203 17:02:46.779184 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:02:47 crc kubenswrapper[4841]: I1203 17:02:47.019064 4841 patch_prober.go:28] interesting pod/console-f9d7485db-ngr75 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Dec 03 17:02:47 crc kubenswrapper[4841]: I1203 17:02:47.019118 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ngr75" podUID="34e90356-ed2e-4e60-9e00-97a1b62d640b" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Dec 03 17:02:47 crc kubenswrapper[4841]: I1203 17:02:47.768046 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:47 crc kubenswrapper[4841]: I1203 17:02:47.770664 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jghh5" Dec 03 17:02:51 crc kubenswrapper[4841]: I1203 17:02:51.798432 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:51 crc 
kubenswrapper[4841]: I1203 17:02:51.803943 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99-metrics-certs\") pod \"network-metrics-daemon-fcw2m\" (UID: \"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99\") " pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:51 crc kubenswrapper[4841]: I1203 17:02:51.974881 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fcw2m" Dec 03 17:02:54 crc kubenswrapper[4841]: I1203 17:02:54.946698 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:02:56 crc kubenswrapper[4841]: I1203 17:02:56.778477 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:02:56 crc kubenswrapper[4841]: I1203 17:02:56.778833 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:02:56 crc kubenswrapper[4841]: I1203 17:02:56.778611 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:02:56 crc kubenswrapper[4841]: I1203 17:02:56.778898 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Dec 03 17:02:56 crc kubenswrapper[4841]: I1203 17:02:56.778954 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-k26kv"
Dec 03 17:02:56 crc kubenswrapper[4841]: I1203 17:02:56.779489 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body=
Dec 03 17:02:56 crc kubenswrapper[4841]: I1203 17:02:56.779548 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Dec 03 17:02:56 crc kubenswrapper[4841]: I1203 17:02:56.780496 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"b62a19e799c7a91fb89c2085dcce69ef0b982fa5d002174984eca5db1981dc87"} pod="openshift-console/downloads-7954f5f757-k26kv" containerMessage="Container download-server failed liveness probe, will be restarted"
Dec 03 17:02:56 crc kubenswrapper[4841]: I1203 17:02:56.780678 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" containerID="cri-o://b62a19e799c7a91fb89c2085dcce69ef0b982fa5d002174984eca5db1981dc87" gracePeriod=2
Dec 03 17:02:57 crc kubenswrapper[4841]: I1203 17:02:57.022793 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ngr75"
Dec 03 17:02:57 crc kubenswrapper[4841]: I1203 17:02:57.026791 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ngr75"
Dec 03 17:02:59 crc kubenswrapper[4841]: I1203 17:02:59.544858 4841 generic.go:334] "Generic (PLEG): container finished" podID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerID="b62a19e799c7a91fb89c2085dcce69ef0b982fa5d002174984eca5db1981dc87" exitCode=0
Dec 03 17:02:59 crc kubenswrapper[4841]: I1203 17:02:59.544956 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-k26kv" event={"ID":"ecf0dd5d-5164-4c5f-a4b5-e394182adc25","Type":"ContainerDied","Data":"b62a19e799c7a91fb89c2085dcce69ef0b982fa5d002174984eca5db1981dc87"}
Dec 03 17:03:06 crc kubenswrapper[4841]: I1203 17:03:06.779026 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body=
Dec 03 17:03:06 crc kubenswrapper[4841]: I1203 17:03:06.779548 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Dec 03 17:03:06 crc kubenswrapper[4841]: I1203 17:03:06.906268 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zj64h"
Dec 03 17:03:09 crc kubenswrapper[4841]: I1203 17:03:09.317257 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 17:03:09 crc kubenswrapper[4841]: I1203 17:03:09.317368 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.543801 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 03 17:03:14 crc kubenswrapper[4841]: E1203 17:03:14.544524 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885" containerName="pruner"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.544540 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885" containerName="pruner"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.544648 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5220c2fa-0a5f-4b19-b2bc-a8f91e0c3885" containerName="pruner"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.545135 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.555209 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"]
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.623134 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af034b5d-7234-4700-998a-0dc6839344f0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"af034b5d-7234-4700-998a-0dc6839344f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.623317 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af034b5d-7234-4700-998a-0dc6839344f0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"af034b5d-7234-4700-998a-0dc6839344f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.724408 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af034b5d-7234-4700-998a-0dc6839344f0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"af034b5d-7234-4700-998a-0dc6839344f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.724628 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af034b5d-7234-4700-998a-0dc6839344f0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"af034b5d-7234-4700-998a-0dc6839344f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.724756 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af034b5d-7234-4700-998a-0dc6839344f0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"af034b5d-7234-4700-998a-0dc6839344f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.741709 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af034b5d-7234-4700-998a-0dc6839344f0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"af034b5d-7234-4700-998a-0dc6839344f0\") " pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 17:03:14 crc kubenswrapper[4841]: I1203 17:03:14.875587 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Dec 03 17:03:15 crc kubenswrapper[4841]: I1203 17:03:15.450013 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Dec 03 17:03:16 crc kubenswrapper[4841]: I1203 17:03:16.779109 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body=
Dec 03 17:03:16 crc kubenswrapper[4841]: I1203 17:03:16.779486 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.341131 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.341885 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.356328 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.358374 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"]
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.490990 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kube-api-access\") pod \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\" (UID: \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\") "
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.491057 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kubelet-dir\") pod \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\" (UID: \"78567fd3-5f88-4b0c-b000-8bce7650e6a7\") "
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.491269 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "78567fd3-5f88-4b0c-b000-8bce7650e6a7" (UID: "78567fd3-5f88-4b0c-b000-8bce7650e6a7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.491472 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.491537 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-var-lock\") pod \"installer-9-crc\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.491679 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5390187b-2235-4acf-9edf-385393ccaea8-kube-api-access\") pod \"installer-9-crc\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.492644 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kubelet-dir\") on node \"crc\" DevicePath \"\""
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.529283 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "78567fd3-5f88-4b0c-b000-8bce7650e6a7" (UID: "78567fd3-5f88-4b0c-b000-8bce7650e6a7"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.663271 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.663324 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-var-lock\") pod \"installer-9-crc\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.663374 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5390187b-2235-4acf-9edf-385393ccaea8-kube-api-access\") pod \"installer-9-crc\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.663395 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-kubelet-dir\") pod \"installer-9-crc\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.663421 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/78567fd3-5f88-4b0c-b000-8bce7650e6a7-kube-api-access\") on node \"crc\" DevicePath \"\""
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.663455 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-var-lock\") pod \"installer-9-crc\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.682643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5390187b-2235-4acf-9edf-385393ccaea8-kube-api-access\") pod \"installer-9-crc\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.683744 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"78567fd3-5f88-4b0c-b000-8bce7650e6a7","Type":"ContainerDied","Data":"ab1751aa043747690ac75da01e01355c570144f4f4d97ac666df2982b56e1c33"}
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.683797 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab1751aa043747690ac75da01e01355c570144f4f4d97ac666df2982b56e1c33"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.683878 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Dec 03 17:03:19 crc kubenswrapper[4841]: I1203 17:03:19.975456 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc"
Dec 03 17:03:26 crc kubenswrapper[4841]: I1203 17:03:26.780047 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body=
Dec 03 17:03:26 crc kubenswrapper[4841]: I1203 17:03:26.780413 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Dec 03 17:03:32 crc kubenswrapper[4841]: E1203 17:03:32.960960 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 03 17:03:32 crc kubenswrapper[4841]: E1203 17:03:32.961914 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8db4g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-tcknq_openshift-marketplace(9585f22b-d1dd-499c-a9e8-37c212f22844): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 17:03:32 crc kubenswrapper[4841]: E1203 17:03:32.963104 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-tcknq" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844"
Dec 03 17:03:36 crc kubenswrapper[4841]: E1203 17:03:36.274073 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-tcknq" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844"
Dec 03 17:03:36 crc kubenswrapper[4841]: E1203 17:03:36.351503 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 03 17:03:36 crc kubenswrapper[4841]: E1203 17:03:36.351812 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tkcv4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-b9qwk_openshift-marketplace(0e9e3466-59a6-466a-bb38-936fc4be6f9a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 17:03:36 crc kubenswrapper[4841]: E1203 17:03:36.353624 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-b9qwk" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a"
Dec 03 17:03:36 crc kubenswrapper[4841]: I1203 17:03:36.778140 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body=
Dec 03 17:03:36 crc kubenswrapper[4841]: I1203 17:03:36.778203 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused"
Dec 03 17:03:39 crc kubenswrapper[4841]: I1203 17:03:39.323602 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 17:03:39 crc kubenswrapper[4841]: I1203 17:03:39.325733 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 17:03:39 crc kubenswrapper[4841]: I1203 17:03:39.325817 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk"
Dec 03 17:03:39 crc kubenswrapper[4841]: I1203 17:03:39.326554 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 17:03:39 crc kubenswrapper[4841]: I1203 17:03:39.326639 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014" gracePeriod=600
Dec 03 17:03:39 crc kubenswrapper[4841]: I1203 17:03:39.942603 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014" exitCode=0
Dec 03 17:03:39 crc kubenswrapper[4841]: I1203 17:03:39.942646 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014"}
Dec 03 17:03:41 crc kubenswrapper[4841]: E1203 17:03:41.259742 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-b9qwk" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a"
Dec 03 17:03:41 crc kubenswrapper[4841]: E1203 17:03:41.322712 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 03 17:03:41 crc kubenswrapper[4841]: E1203 17:03:41.323395 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7s99j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-8pdm9_openshift-marketplace(09e5fe7e-183b-431c-a431-e58aeba0e1aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 17:03:41 crc kubenswrapper[4841]: E1203 17:03:41.325485 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-8pdm9" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.630553 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-8pdm9" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.737679 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.737850 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-flmb5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-nr854_openshift-marketplace(73b4070c-62dc-49b6-b2fe-8ae468318da3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.742075 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-nr854" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.765409 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.765821 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bk5fp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-4nlrj_openshift-marketplace(f9594e34-3c26-4c76-b090-c9d8218398a6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.767740 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-4nlrj" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.776613 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.776791 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q48cg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-glndx_openshift-marketplace(958ead90-9bd3-4c1b-b9e5-21378ecff345): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.778646 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-glndx" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.798952 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.799127 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vn8zd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-dshxz_openshift-marketplace(68909a3d-4731-4851-a511-0b66e05d3741): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.800537 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-dshxz" podUID="68909a3d-4731-4851-a511-0b66e05d3741"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.804309 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18"
Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.804485 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l4fpg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8cl5k_openshift-marketplace(66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 17:03:42 crc kubenswrapper[4841]: E1203 17:03:42.807212 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-8cl5k" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" Dec 03 17:03:43 crc 
kubenswrapper[4841]: I1203 17:03:43.021580 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-k26kv" event={"ID":"ecf0dd5d-5164-4c5f-a4b5-e394182adc25","Type":"ContainerStarted","Data":"de21dfc097f8b6799aff0c38c9db8d13cd2f972eed788c1f406c51222b3a73e2"} Dec 03 17:03:43 crc kubenswrapper[4841]: I1203 17:03:43.021929 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-k26kv" Dec 03 17:03:43 crc kubenswrapper[4841]: I1203 17:03:43.022898 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:03:43 crc kubenswrapper[4841]: I1203 17:03:43.023268 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:03:43 crc kubenswrapper[4841]: I1203 17:03:43.029037 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fcw2m"] Dec 03 17:03:43 crc kubenswrapper[4841]: I1203 17:03:43.032290 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"daa18669936eceb48d01d308236c99f78a8e0296b021db0d3c986e026fa5670c"} Dec 03 17:03:43 crc kubenswrapper[4841]: E1203 17:03:43.034468 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" 
pod="openshift-marketplace/redhat-marketplace-4nlrj" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" Dec 03 17:03:43 crc kubenswrapper[4841]: E1203 17:03:43.034974 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-nr854" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" Dec 03 17:03:43 crc kubenswrapper[4841]: E1203 17:03:43.035055 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8cl5k" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" Dec 03 17:03:43 crc kubenswrapper[4841]: E1203 17:03:43.035115 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-glndx" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" Dec 03 17:03:43 crc kubenswrapper[4841]: E1203 17:03:43.035161 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-dshxz" podUID="68909a3d-4731-4851-a511-0b66e05d3741" Dec 03 17:03:43 crc kubenswrapper[4841]: I1203 17:03:43.126823 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 17:03:43 crc kubenswrapper[4841]: I1203 17:03:43.384721 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 17:03:43 crc kubenswrapper[4841]: W1203 17:03:43.401885 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podaf034b5d_7234_4700_998a_0dc6839344f0.slice/crio-92fc9a10c2e74225b4321da2355a11b29271898fac04a0967dd0971dd3fbf65a WatchSource:0}: Error finding container 92fc9a10c2e74225b4321da2355a11b29271898fac04a0967dd0971dd3fbf65a: Status 404 returned error can't find the container with id 92fc9a10c2e74225b4321da2355a11b29271898fac04a0967dd0971dd3fbf65a Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.038885 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"af034b5d-7234-4700-998a-0dc6839344f0","Type":"ContainerStarted","Data":"4f38c6b2cce1e7ea58ffd119a833718beb8413d3abbba9e6e076e03f98967eda"} Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.039235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"af034b5d-7234-4700-998a-0dc6839344f0","Type":"ContainerStarted","Data":"92fc9a10c2e74225b4321da2355a11b29271898fac04a0967dd0971dd3fbf65a"} Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.043064 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5390187b-2235-4acf-9edf-385393ccaea8","Type":"ContainerStarted","Data":"02debe2bfa63944868e9352c3764ad87301cdce3cbedb08bd9d6ec5425bb68d6"} Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.043105 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5390187b-2235-4acf-9edf-385393ccaea8","Type":"ContainerStarted","Data":"f9342c043ff31d2d29ba3c8c1ee19e0e43801274bd5cc89e4078c908998b1c32"} Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.058058 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-fcw2m" event={"ID":"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99","Type":"ContainerStarted","Data":"d81ce597a248baea269a9357b024783f182676a6d1e17af8719f68a949dbdefa"} Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.058135 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" event={"ID":"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99","Type":"ContainerStarted","Data":"0535dacb262301be356695a106ad3f7a75076771f09c6c1f33c5ae1a3491f7f2"} Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.058161 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fcw2m" event={"ID":"fc6b64fa-39c5-49e0-afeb-f9db0d1e9a99","Type":"ContainerStarted","Data":"700bc0d670834acf3b19b3ddd4ea85374a1fe41468966554799a7cf77908bdef"} Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.059596 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.059665 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.101315 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=30.101287866 podStartE2EDuration="30.101287866s" podCreationTimestamp="2025-12-03 17:03:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
17:03:44.071326476 +0000 UTC m=+218.458847203" watchObservedRunningTime="2025-12-03 17:03:44.101287866 +0000 UTC m=+218.488808633" Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.102313 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=25.102300943 podStartE2EDuration="25.102300943s" podCreationTimestamp="2025-12-03 17:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:03:44.098380919 +0000 UTC m=+218.485901646" watchObservedRunningTime="2025-12-03 17:03:44.102300943 +0000 UTC m=+218.489821710" Dec 03 17:03:44 crc kubenswrapper[4841]: I1203 17:03:44.121276 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fcw2m" podStartSLOduration=195.121252022 podStartE2EDuration="3m15.121252022s" podCreationTimestamp="2025-12-03 17:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:03:44.115574363 +0000 UTC m=+218.503095130" watchObservedRunningTime="2025-12-03 17:03:44.121252022 +0000 UTC m=+218.508772759" Dec 03 17:03:45 crc kubenswrapper[4841]: I1203 17:03:45.052557 4841 generic.go:334] "Generic (PLEG): container finished" podID="af034b5d-7234-4700-998a-0dc6839344f0" containerID="4f38c6b2cce1e7ea58ffd119a833718beb8413d3abbba9e6e076e03f98967eda" exitCode=0 Dec 03 17:03:45 crc kubenswrapper[4841]: I1203 17:03:45.052869 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"af034b5d-7234-4700-998a-0dc6839344f0","Type":"ContainerDied","Data":"4f38c6b2cce1e7ea58ffd119a833718beb8413d3abbba9e6e076e03f98967eda"} Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.269470 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.453763 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af034b5d-7234-4700-998a-0dc6839344f0-kubelet-dir\") pod \"af034b5d-7234-4700-998a-0dc6839344f0\" (UID: \"af034b5d-7234-4700-998a-0dc6839344f0\") " Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.453817 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af034b5d-7234-4700-998a-0dc6839344f0-kube-api-access\") pod \"af034b5d-7234-4700-998a-0dc6839344f0\" (UID: \"af034b5d-7234-4700-998a-0dc6839344f0\") " Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.454589 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/af034b5d-7234-4700-998a-0dc6839344f0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "af034b5d-7234-4700-998a-0dc6839344f0" (UID: "af034b5d-7234-4700-998a-0dc6839344f0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.477219 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af034b5d-7234-4700-998a-0dc6839344f0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "af034b5d-7234-4700-998a-0dc6839344f0" (UID: "af034b5d-7234-4700-998a-0dc6839344f0"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.554773 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/af034b5d-7234-4700-998a-0dc6839344f0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.554808 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/af034b5d-7234-4700-998a-0dc6839344f0-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.778040 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.778099 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.791627 4841 patch_prober.go:28] interesting pod/downloads-7954f5f757-k26kv container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" start-of-body= Dec 03 17:03:46 crc kubenswrapper[4841]: I1203 17:03:46.791684 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-k26kv" podUID="ecf0dd5d-5164-4c5f-a4b5-e394182adc25" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.27:8080/\": dial tcp 10.217.0.27:8080: connect: connection refused" Dec 03 17:03:47 
crc kubenswrapper[4841]: I1203 17:03:47.065806 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"af034b5d-7234-4700-998a-0dc6839344f0","Type":"ContainerDied","Data":"92fc9a10c2e74225b4321da2355a11b29271898fac04a0967dd0971dd3fbf65a"} Dec 03 17:03:47 crc kubenswrapper[4841]: I1203 17:03:47.066169 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92fc9a10c2e74225b4321da2355a11b29271898fac04a0967dd0971dd3fbf65a" Dec 03 17:03:47 crc kubenswrapper[4841]: I1203 17:03:47.065881 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 17:03:48 crc kubenswrapper[4841]: I1203 17:03:48.763845 4841 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-zwx96 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 17:03:49 crc kubenswrapper[4841]: I1203 17:03:49.003942 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-zwx96" podUID="02cf73a4-150b-4a14-9e46-6e986b38304f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 17:03:51 crc kubenswrapper[4841]: I1203 17:03:51.086587 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcknq" event={"ID":"9585f22b-d1dd-499c-a9e8-37c212f22844","Type":"ContainerStarted","Data":"3ddf91f1b753dc421b58699a90ab02a55e284a475d8ec763381a571eab5d2eef"} Dec 03 17:03:56 crc kubenswrapper[4841]: I1203 17:03:56.211771 4841 
generic.go:334] "Generic (PLEG): container finished" podID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerID="3ddf91f1b753dc421b58699a90ab02a55e284a475d8ec763381a571eab5d2eef" exitCode=0 Dec 03 17:03:56 crc kubenswrapper[4841]: I1203 17:03:56.211865 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcknq" event={"ID":"9585f22b-d1dd-499c-a9e8-37c212f22844","Type":"ContainerDied","Data":"3ddf91f1b753dc421b58699a90ab02a55e284a475d8ec763381a571eab5d2eef"} Dec 03 17:03:56 crc kubenswrapper[4841]: I1203 17:03:56.791943 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-k26kv" Dec 03 17:04:04 crc kubenswrapper[4841]: I1203 17:04:04.254115 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr854" event={"ID":"73b4070c-62dc-49b6-b2fe-8ae468318da3","Type":"ContainerStarted","Data":"95720d968d815be533c1d2adb00983c1f208667747a13ace48dc9cec9fffe305"} Dec 03 17:04:04 crc kubenswrapper[4841]: I1203 17:04:04.256836 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glndx" event={"ID":"958ead90-9bd3-4c1b-b9e5-21378ecff345","Type":"ContainerStarted","Data":"bb428106c4a2a9973d08cf70ac3738908c6d8a54c375731eb45f0a2fd8adab84"} Dec 03 17:04:04 crc kubenswrapper[4841]: I1203 17:04:04.258231 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dshxz" event={"ID":"68909a3d-4731-4851-a511-0b66e05d3741","Type":"ContainerStarted","Data":"ab650f83cbf0e0eb180a8e8a63165c5ced7ed82102d06334445585569117035a"} Dec 03 17:04:04 crc kubenswrapper[4841]: I1203 17:04:04.261261 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcknq" event={"ID":"9585f22b-d1dd-499c-a9e8-37c212f22844","Type":"ContainerStarted","Data":"4d0fcf839ad854fa274a34f5581c87ec9ec5c9d0f67c6f963124e321f9e3d96f"} Dec 03 
17:04:04 crc kubenswrapper[4841]: I1203 17:04:04.263553 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9qwk" event={"ID":"0e9e3466-59a6-466a-bb38-936fc4be6f9a","Type":"ContainerStarted","Data":"ef7d8b3c1abf9c4671362e9227ae208d9858fd61485832aaf39ed833414b8073"} Dec 03 17:04:04 crc kubenswrapper[4841]: I1203 17:04:04.267923 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cl5k" event={"ID":"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16","Type":"ContainerStarted","Data":"968362143ac535f6e0533c678681c1322180da64a783042625b304ccd864a9d4"} Dec 03 17:04:04 crc kubenswrapper[4841]: I1203 17:04:04.269453 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pdm9" event={"ID":"09e5fe7e-183b-431c-a431-e58aeba0e1aa","Type":"ContainerStarted","Data":"115887d5dee0eb158c5be9470bbbca73271555f6219dfd5a29b9473f24a8e10a"} Dec 03 17:04:04 crc kubenswrapper[4841]: I1203 17:04:04.271145 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nlrj" event={"ID":"f9594e34-3c26-4c76-b090-c9d8218398a6","Type":"ContainerStarted","Data":"351945b9e578ccdfaa9fe7ac76a60a905b539dd2487be7e25915fa87e9faa82f"} Dec 03 17:04:04 crc kubenswrapper[4841]: I1203 17:04:04.369374 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tcknq" podStartSLOduration=3.207241592 podStartE2EDuration="1m32.369354487s" podCreationTimestamp="2025-12-03 17:02:32 +0000 UTC" firstStartedPulling="2025-12-03 17:02:34.60667831 +0000 UTC m=+148.994199037" lastFinishedPulling="2025-12-03 17:04:03.768791205 +0000 UTC m=+238.156311932" observedRunningTime="2025-12-03 17:04:04.365758722 +0000 UTC m=+238.753279449" watchObservedRunningTime="2025-12-03 17:04:04.369354487 +0000 UTC m=+238.756875204" Dec 03 17:04:05 crc kubenswrapper[4841]: I1203 17:04:05.277503 4841 
generic.go:334] "Generic (PLEG): container finished" podID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerID="351945b9e578ccdfaa9fe7ac76a60a905b539dd2487be7e25915fa87e9faa82f" exitCode=0 Dec 03 17:04:05 crc kubenswrapper[4841]: I1203 17:04:05.277572 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nlrj" event={"ID":"f9594e34-3c26-4c76-b090-c9d8218398a6","Type":"ContainerDied","Data":"351945b9e578ccdfaa9fe7ac76a60a905b539dd2487be7e25915fa87e9faa82f"} Dec 03 17:04:05 crc kubenswrapper[4841]: I1203 17:04:05.282440 4841 generic.go:334] "Generic (PLEG): container finished" podID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerID="ef7d8b3c1abf9c4671362e9227ae208d9858fd61485832aaf39ed833414b8073" exitCode=0 Dec 03 17:04:05 crc kubenswrapper[4841]: I1203 17:04:05.282475 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9qwk" event={"ID":"0e9e3466-59a6-466a-bb38-936fc4be6f9a","Type":"ContainerDied","Data":"ef7d8b3c1abf9c4671362e9227ae208d9858fd61485832aaf39ed833414b8073"} Dec 03 17:04:06 crc kubenswrapper[4841]: I1203 17:04:06.294927 4841 generic.go:334] "Generic (PLEG): container finished" podID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerID="968362143ac535f6e0533c678681c1322180da64a783042625b304ccd864a9d4" exitCode=0 Dec 03 17:04:06 crc kubenswrapper[4841]: I1203 17:04:06.295009 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cl5k" event={"ID":"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16","Type":"ContainerDied","Data":"968362143ac535f6e0533c678681c1322180da64a783042625b304ccd864a9d4"} Dec 03 17:04:06 crc kubenswrapper[4841]: I1203 17:04:06.299953 4841 generic.go:334] "Generic (PLEG): container finished" podID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerID="95720d968d815be533c1d2adb00983c1f208667747a13ace48dc9cec9fffe305" exitCode=0 Dec 03 17:04:06 crc kubenswrapper[4841]: I1203 17:04:06.300036 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr854" event={"ID":"73b4070c-62dc-49b6-b2fe-8ae468318da3","Type":"ContainerDied","Data":"95720d968d815be533c1d2adb00983c1f208667747a13ace48dc9cec9fffe305"} Dec 03 17:04:06 crc kubenswrapper[4841]: I1203 17:04:06.302526 4841 generic.go:334] "Generic (PLEG): container finished" podID="68909a3d-4731-4851-a511-0b66e05d3741" containerID="ab650f83cbf0e0eb180a8e8a63165c5ced7ed82102d06334445585569117035a" exitCode=0 Dec 03 17:04:06 crc kubenswrapper[4841]: I1203 17:04:06.302593 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dshxz" event={"ID":"68909a3d-4731-4851-a511-0b66e05d3741","Type":"ContainerDied","Data":"ab650f83cbf0e0eb180a8e8a63165c5ced7ed82102d06334445585569117035a"} Dec 03 17:04:06 crc kubenswrapper[4841]: I1203 17:04:06.306727 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9qwk" event={"ID":"0e9e3466-59a6-466a-bb38-936fc4be6f9a","Type":"ContainerStarted","Data":"fd17b574103aec78139c99e2764c08850cbc24908ac4390934d0159afe9830b7"} Dec 03 17:04:06 crc kubenswrapper[4841]: I1203 17:04:06.376587 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b9qwk" podStartSLOduration=3.0078994 podStartE2EDuration="1m32.376571492s" podCreationTimestamp="2025-12-03 17:02:34 +0000 UTC" firstStartedPulling="2025-12-03 17:02:36.673448433 +0000 UTC m=+151.060969160" lastFinishedPulling="2025-12-03 17:04:06.042120525 +0000 UTC m=+240.429641252" observedRunningTime="2025-12-03 17:04:06.374936379 +0000 UTC m=+240.762457106" watchObservedRunningTime="2025-12-03 17:04:06.376571492 +0000 UTC m=+240.764092219" Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.314676 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr854" 
event={"ID":"73b4070c-62dc-49b6-b2fe-8ae468318da3","Type":"ContainerStarted","Data":"30958f4cd329db5ab4713fa275cbdaa5a247e2af6710d7b7cb275b2eaa8f3050"} Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.317150 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dshxz" event={"ID":"68909a3d-4731-4851-a511-0b66e05d3741","Type":"ContainerStarted","Data":"ff00c6f334f026ce9d777ebc4c69728dbcd264c03f8734e4abd9a21ad0ed7d28"} Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.318956 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cl5k" event={"ID":"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16","Type":"ContainerStarted","Data":"30053b46b931bffeafe96deecae46e5de08c71ee7e3263f0079d4b70fb803263"} Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.320486 4841 generic.go:334] "Generic (PLEG): container finished" podID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerID="115887d5dee0eb158c5be9470bbbca73271555f6219dfd5a29b9473f24a8e10a" exitCode=0 Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.320519 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pdm9" event={"ID":"09e5fe7e-183b-431c-a431-e58aeba0e1aa","Type":"ContainerDied","Data":"115887d5dee0eb158c5be9470bbbca73271555f6219dfd5a29b9473f24a8e10a"} Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.323332 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nlrj" event={"ID":"f9594e34-3c26-4c76-b090-c9d8218398a6","Type":"ContainerStarted","Data":"97c27b5a771ebdb220ec75ef919975bee194d96e69b2b8c2ae5ede7432414b33"} Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.344417 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-nr854" podStartSLOduration=3.005226198 podStartE2EDuration="1m35.344398955s" podCreationTimestamp="2025-12-03 17:02:32 
+0000 UTC" firstStartedPulling="2025-12-03 17:02:34.512604992 +0000 UTC m=+148.900125719" lastFinishedPulling="2025-12-03 17:04:06.851777749 +0000 UTC m=+241.239298476" observedRunningTime="2025-12-03 17:04:07.342832144 +0000 UTC m=+241.730352871" watchObservedRunningTime="2025-12-03 17:04:07.344398955 +0000 UTC m=+241.731919682" Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.371269 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dshxz" podStartSLOduration=3.2095922359999998 podStartE2EDuration="1m35.371248243s" podCreationTimestamp="2025-12-03 17:02:32 +0000 UTC" firstStartedPulling="2025-12-03 17:02:34.590854615 +0000 UTC m=+148.978375342" lastFinishedPulling="2025-12-03 17:04:06.752510622 +0000 UTC m=+241.140031349" observedRunningTime="2025-12-03 17:04:07.367436153 +0000 UTC m=+241.754956890" watchObservedRunningTime="2025-12-03 17:04:07.371248243 +0000 UTC m=+241.758768970" Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.389763 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8cl5k" podStartSLOduration=4.086758499 podStartE2EDuration="1m35.389744141s" podCreationTimestamp="2025-12-03 17:02:32 +0000 UTC" firstStartedPulling="2025-12-03 17:02:35.625268083 +0000 UTC m=+150.012788840" lastFinishedPulling="2025-12-03 17:04:06.928253755 +0000 UTC m=+241.315774482" observedRunningTime="2025-12-03 17:04:07.385219321 +0000 UTC m=+241.772740048" watchObservedRunningTime="2025-12-03 17:04:07.389744141 +0000 UTC m=+241.777264868" Dec 03 17:04:07 crc kubenswrapper[4841]: I1203 17:04:07.422763 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4nlrj" podStartSLOduration=4.007360423 podStartE2EDuration="1m33.422746361s" podCreationTimestamp="2025-12-03 17:02:34 +0000 UTC" firstStartedPulling="2025-12-03 17:02:36.704873547 +0000 UTC m=+151.092394274" 
lastFinishedPulling="2025-12-03 17:04:06.120259485 +0000 UTC m=+240.507780212" observedRunningTime="2025-12-03 17:04:07.421761365 +0000 UTC m=+241.809282092" watchObservedRunningTime="2025-12-03 17:04:07.422746361 +0000 UTC m=+241.810267088" Dec 03 17:04:08 crc kubenswrapper[4841]: I1203 17:04:08.330417 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pdm9" event={"ID":"09e5fe7e-183b-431c-a431-e58aeba0e1aa","Type":"ContainerStarted","Data":"5d79d13e767f4c0046bfaceb777c6e578a636364aa802ffe08e1095e8295ecd7"} Dec 03 17:04:08 crc kubenswrapper[4841]: I1203 17:04:08.332437 4841 generic.go:334] "Generic (PLEG): container finished" podID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerID="bb428106c4a2a9973d08cf70ac3738908c6d8a54c375731eb45f0a2fd8adab84" exitCode=0 Dec 03 17:04:08 crc kubenswrapper[4841]: I1203 17:04:08.332482 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glndx" event={"ID":"958ead90-9bd3-4c1b-b9e5-21378ecff345","Type":"ContainerDied","Data":"bb428106c4a2a9973d08cf70ac3738908c6d8a54c375731eb45f0a2fd8adab84"} Dec 03 17:04:08 crc kubenswrapper[4841]: I1203 17:04:08.351601 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8pdm9" podStartSLOduration=2.31791833 podStartE2EDuration="1m33.351583837s" podCreationTimestamp="2025-12-03 17:02:35 +0000 UTC" firstStartedPulling="2025-12-03 17:02:36.67684574 +0000 UTC m=+151.064366467" lastFinishedPulling="2025-12-03 17:04:07.710511247 +0000 UTC m=+242.098031974" observedRunningTime="2025-12-03 17:04:08.348641109 +0000 UTC m=+242.736161836" watchObservedRunningTime="2025-12-03 17:04:08.351583837 +0000 UTC m=+242.739104564" Dec 03 17:04:10 crc kubenswrapper[4841]: I1203 17:04:10.343727 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glndx" 
event={"ID":"958ead90-9bd3-4c1b-b9e5-21378ecff345","Type":"ContainerStarted","Data":"c94dcdd6c11ee6a14ed9505b6a249e47937a04dab5188ebdff788d2b54377bcc"} Dec 03 17:04:11 crc kubenswrapper[4841]: I1203 17:04:11.371553 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-glndx" podStartSLOduration=4.191721857 podStartE2EDuration="1m36.371526427s" podCreationTimestamp="2025-12-03 17:02:35 +0000 UTC" firstStartedPulling="2025-12-03 17:02:36.689367791 +0000 UTC m=+151.076888518" lastFinishedPulling="2025-12-03 17:04:08.869172361 +0000 UTC m=+243.256693088" observedRunningTime="2025-12-03 17:04:11.368495277 +0000 UTC m=+245.756015994" watchObservedRunningTime="2025-12-03 17:04:11.371526427 +0000 UTC m=+245.759047174" Dec 03 17:04:12 crc kubenswrapper[4841]: I1203 17:04:12.491754 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:04:12 crc kubenswrapper[4841]: I1203 17:04:12.492264 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:04:12 crc kubenswrapper[4841]: I1203 17:04:12.688545 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-nr854" Dec 03 17:04:12 crc kubenswrapper[4841]: I1203 17:04:12.688616 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-nr854" Dec 03 17:04:12 crc kubenswrapper[4841]: I1203 17:04:12.805467 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:04:12 crc kubenswrapper[4841]: I1203 17:04:12.805536 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:04:12 crc kubenswrapper[4841]: I1203 17:04:12.820730 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:04:12 crc kubenswrapper[4841]: I1203 17:04:12.822945 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-nr854" Dec 03 17:04:12 crc kubenswrapper[4841]: I1203 17:04:12.845709 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:04:13 crc kubenswrapper[4841]: I1203 17:04:13.007556 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:04:13 crc kubenswrapper[4841]: I1203 17:04:13.007620 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:04:13 crc kubenswrapper[4841]: I1203 17:04:13.054018 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:04:13 crc kubenswrapper[4841]: I1203 17:04:13.409207 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:04:13 crc kubenswrapper[4841]: I1203 17:04:13.410094 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:04:13 crc kubenswrapper[4841]: I1203 17:04:13.410282 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-nr854" Dec 03 17:04:13 crc kubenswrapper[4841]: I1203 17:04:13.422804 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:04:14 crc kubenswrapper[4841]: I1203 17:04:14.411421 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-4nlrj" Dec 03 17:04:14 crc kubenswrapper[4841]: I1203 17:04:14.411478 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4nlrj" Dec 03 17:04:14 crc kubenswrapper[4841]: I1203 17:04:14.466178 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4nlrj" Dec 03 17:04:14 crc kubenswrapper[4841]: I1203 17:04:14.850950 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b9qwk" Dec 03 17:04:14 crc kubenswrapper[4841]: I1203 17:04:14.851035 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b9qwk" Dec 03 17:04:14 crc kubenswrapper[4841]: I1203 17:04:14.898379 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b9qwk" Dec 03 17:04:14 crc kubenswrapper[4841]: I1203 17:04:14.913033 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9zpn"] Dec 03 17:04:15 crc kubenswrapper[4841]: I1203 17:04:15.108009 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tcknq"] Dec 03 17:04:15 crc kubenswrapper[4841]: I1203 17:04:15.369983 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tcknq" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerName="registry-server" containerID="cri-o://4d0fcf839ad854fa274a34f5581c87ec9ec5c9d0f67c6f963124e321f9e3d96f" gracePeriod=2 Dec 03 17:04:15 crc kubenswrapper[4841]: I1203 17:04:15.419969 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4nlrj" Dec 03 17:04:15 crc kubenswrapper[4841]: I1203 17:04:15.426738 4841 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b9qwk" Dec 03 17:04:15 crc kubenswrapper[4841]: I1203 17:04:15.773454 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:04:15 crc kubenswrapper[4841]: I1203 17:04:15.773857 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:04:15 crc kubenswrapper[4841]: I1203 17:04:15.838068 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:04:16 crc kubenswrapper[4841]: I1203 17:04:16.140108 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:04:16 crc kubenswrapper[4841]: I1203 17:04:16.140193 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:04:16 crc kubenswrapper[4841]: I1203 17:04:16.182345 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:04:16 crc kubenswrapper[4841]: I1203 17:04:16.440169 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:04:16 crc kubenswrapper[4841]: I1203 17:04:16.510801 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8cl5k"] Dec 03 17:04:16 crc kubenswrapper[4841]: I1203 17:04:16.511170 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8cl5k" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerName="registry-server" containerID="cri-o://30053b46b931bffeafe96deecae46e5de08c71ee7e3263f0079d4b70fb803263" gracePeriod=2 Dec 03 17:04:17 crc kubenswrapper[4841]: 
I1203 17:04:17.432354 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-8pdm9" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerName="registry-server" probeResult="failure" output=< Dec 03 17:04:17 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Dec 03 17:04:17 crc kubenswrapper[4841]: > Dec 03 17:04:17 crc kubenswrapper[4841]: I1203 17:04:17.514577 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9qwk"] Dec 03 17:04:18 crc kubenswrapper[4841]: I1203 17:04:18.388991 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b9qwk" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerName="registry-server" containerID="cri-o://fd17b574103aec78139c99e2764c08850cbc24908ac4390934d0159afe9830b7" gracePeriod=2 Dec 03 17:04:19 crc kubenswrapper[4841]: I1203 17:04:19.395362 4841 generic.go:334] "Generic (PLEG): container finished" podID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerID="4d0fcf839ad854fa274a34f5581c87ec9ec5c9d0f67c6f963124e321f9e3d96f" exitCode=0 Dec 03 17:04:19 crc kubenswrapper[4841]: I1203 17:04:19.395411 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcknq" event={"ID":"9585f22b-d1dd-499c-a9e8-37c212f22844","Type":"ContainerDied","Data":"4d0fcf839ad854fa274a34f5581c87ec9ec5c9d0f67c6f963124e321f9e3d96f"} Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.407135 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8cl5k_66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16/registry-server/0.log" Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.408206 4841 generic.go:334] "Generic (PLEG): container finished" podID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerID="30053b46b931bffeafe96deecae46e5de08c71ee7e3263f0079d4b70fb803263" exitCode=137 Dec 03 
17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.408258 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cl5k" event={"ID":"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16","Type":"ContainerDied","Data":"30053b46b931bffeafe96deecae46e5de08c71ee7e3263f0079d4b70fb803263"} Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.410309 4841 generic.go:334] "Generic (PLEG): container finished" podID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerID="fd17b574103aec78139c99e2764c08850cbc24908ac4390934d0159afe9830b7" exitCode=0 Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.410334 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9qwk" event={"ID":"0e9e3466-59a6-466a-bb38-936fc4be6f9a","Type":"ContainerDied","Data":"fd17b574103aec78139c99e2764c08850cbc24908ac4390934d0159afe9830b7"} Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.552029 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.671789 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8db4g\" (UniqueName: \"kubernetes.io/projected/9585f22b-d1dd-499c-a9e8-37c212f22844-kube-api-access-8db4g\") pod \"9585f22b-d1dd-499c-a9e8-37c212f22844\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.671880 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-utilities\") pod \"9585f22b-d1dd-499c-a9e8-37c212f22844\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.671930 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-catalog-content\") pod \"9585f22b-d1dd-499c-a9e8-37c212f22844\" (UID: \"9585f22b-d1dd-499c-a9e8-37c212f22844\") " Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.672923 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-utilities" (OuterVolumeSpecName: "utilities") pod "9585f22b-d1dd-499c-a9e8-37c212f22844" (UID: "9585f22b-d1dd-499c-a9e8-37c212f22844"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.679481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9585f22b-d1dd-499c-a9e8-37c212f22844-kube-api-access-8db4g" (OuterVolumeSpecName: "kube-api-access-8db4g") pod "9585f22b-d1dd-499c-a9e8-37c212f22844" (UID: "9585f22b-d1dd-499c-a9e8-37c212f22844"). InnerVolumeSpecName "kube-api-access-8db4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.715722 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9585f22b-d1dd-499c-a9e8-37c212f22844" (UID: "9585f22b-d1dd-499c-a9e8-37c212f22844"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.773215 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8db4g\" (UniqueName: \"kubernetes.io/projected/9585f22b-d1dd-499c-a9e8-37c212f22844-kube-api-access-8db4g\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.773261 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:20 crc kubenswrapper[4841]: I1203 17:04:20.773275 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9585f22b-d1dd-499c-a9e8-37c212f22844-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.040397 4841 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.040667 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerName="registry-server" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.040988 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerName="registry-server" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.041007 4841 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="78567fd3-5f88-4b0c-b000-8bce7650e6a7" containerName="pruner" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041016 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="78567fd3-5f88-4b0c-b000-8bce7650e6a7" containerName="pruner" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.041034 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerName="extract-utilities" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041042 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerName="extract-utilities" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.041050 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerName="extract-content" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041057 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerName="extract-content" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.041070 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af034b5d-7234-4700-998a-0dc6839344f0" containerName="pruner" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041077 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="af034b5d-7234-4700-998a-0dc6839344f0" containerName="pruner" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041190 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" containerName="registry-server" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041203 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="af034b5d-7234-4700-998a-0dc6839344f0" containerName="pruner" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041211 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="78567fd3-5f88-4b0c-b000-8bce7650e6a7" containerName="pruner" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041680 4841 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041803 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041898 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783" gracePeriod=15 Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041946 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1" gracePeriod=15 Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041921 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd" gracePeriod=15 Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.041996 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2" gracePeriod=15 Dec 03 17:04:21 crc kubenswrapper[4841]: 
I1203 17:04:21.042015 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550" gracePeriod=15 Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.042690 4841 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.043014 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043031 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.043051 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043059 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.043070 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043077 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.043086 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 
17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043093 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.043108 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043115 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.043126 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043133 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.043142 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043149 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043280 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043292 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043305 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043313 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043321 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.043332 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.068269 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.178333 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.178409 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.178451 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" 
(UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.178509 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.178542 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.178582 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.178609 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.178646 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279395 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279486 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279523 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279529 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") 
" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279545 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279544 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279591 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279622 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279629 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279673 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279709 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279719 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.279722 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 
17:04:21.279787 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.365260 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:04:21 crc kubenswrapper[4841]: W1203 17:04:21.384919 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-9404437fcde3b6ddc31121f652753a35e27ae989c13cda45c02f64cefdebd157 WatchSource:0}: Error finding container 9404437fcde3b6ddc31121f652753a35e27ae989c13cda45c02f64cefdebd157: Status 404 returned error can't find the container with id 9404437fcde3b6ddc31121f652753a35e27ae989c13cda45c02f64cefdebd157 Dec 03 17:04:21 crc kubenswrapper[4841]: E1203 17:04:21.387445 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dc3639029d40a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 
17:04:21.38694145 +0000 UTC m=+255.774462177,LastTimestamp:2025-12-03 17:04:21.38694145 +0000 UTC m=+255.774462177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.416749 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.418006 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.418874 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550" exitCode=2 Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.420884 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tcknq" event={"ID":"9585f22b-d1dd-499c-a9e8-37c212f22844","Type":"ContainerDied","Data":"f3263a888903dc65cf033b7dc1c0d9a814ef4d82b0991ecd8b3c74ceb1334560"} Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.420926 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-tcknq" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.420952 4841 scope.go:117] "RemoveContainer" containerID="4d0fcf839ad854fa274a34f5581c87ec9ec5c9d0f67c6f963124e321f9e3d96f" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.421534 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.421697 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.421922 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.422175 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"9404437fcde3b6ddc31121f652753a35e27ae989c13cda45c02f64cefdebd157"} Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.433781 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" 
pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.434312 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.434711 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.436868 4841 scope.go:117] "RemoveContainer" containerID="3ddf91f1b753dc421b58699a90ab02a55e284a475d8ec763381a571eab5d2eef" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.467598 4841 scope.go:117] "RemoveContainer" containerID="856a0dd0224f5481438d6bf0a3963be4eeef6f7754a16550d58f2565c5653b58" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.756876 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9qwk" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.758585 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.758872 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.759148 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.759413 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.797018 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8cl5k_66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16/registry-server/0.log" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.797936 4841 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.798824 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.799319 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.799846 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.800467 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.800820 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.888145 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-utilities\") pod \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.888260 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4fpg\" (UniqueName: \"kubernetes.io/projected/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-kube-api-access-l4fpg\") pod \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.888304 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkcv4\" (UniqueName: \"kubernetes.io/projected/0e9e3466-59a6-466a-bb38-936fc4be6f9a-kube-api-access-tkcv4\") pod \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.888388 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-catalog-content\") pod \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.888427 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-utilities\") pod \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\" (UID: \"0e9e3466-59a6-466a-bb38-936fc4be6f9a\") " Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 
17:04:21.888484 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-catalog-content\") pod \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\" (UID: \"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16\") " Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.889536 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-utilities" (OuterVolumeSpecName: "utilities") pod "66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" (UID: "66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.889812 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-utilities" (OuterVolumeSpecName: "utilities") pod "0e9e3466-59a6-466a-bb38-936fc4be6f9a" (UID: "0e9e3466-59a6-466a-bb38-936fc4be6f9a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.894887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-kube-api-access-l4fpg" (OuterVolumeSpecName: "kube-api-access-l4fpg") pod "66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" (UID: "66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16"). InnerVolumeSpecName "kube-api-access-l4fpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.895076 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9e3466-59a6-466a-bb38-936fc4be6f9a-kube-api-access-tkcv4" (OuterVolumeSpecName: "kube-api-access-tkcv4") pod "0e9e3466-59a6-466a-bb38-936fc4be6f9a" (UID: "0e9e3466-59a6-466a-bb38-936fc4be6f9a"). 
InnerVolumeSpecName "kube-api-access-tkcv4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.908363 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e9e3466-59a6-466a-bb38-936fc4be6f9a" (UID: "0e9e3466-59a6-466a-bb38-936fc4be6f9a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.938642 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" (UID: "66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.990615 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4fpg\" (UniqueName: \"kubernetes.io/projected/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-kube-api-access-l4fpg\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.990684 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tkcv4\" (UniqueName: \"kubernetes.io/projected/0e9e3466-59a6-466a-bb38-936fc4be6f9a-kube-api-access-tkcv4\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.990699 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.990730 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/0e9e3466-59a6-466a-bb38-936fc4be6f9a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.990744 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:21 crc kubenswrapper[4841]: I1203 17:04:21.990756 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.430893 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-8cl5k_66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16/registry-server/0.log" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.431686 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8cl5k" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.431678 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8cl5k" event={"ID":"66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16","Type":"ContainerDied","Data":"d8cd27a302b79fcac2a25b585b39a57836f256837e14aa552274b9f210162774"} Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.431830 4841 scope.go:117] "RemoveContainer" containerID="30053b46b931bffeafe96deecae46e5de08c71ee7e3263f0079d4b70fb803263" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.432870 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc 
kubenswrapper[4841]: I1203 17:04:22.433247 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.433660 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.433886 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.435840 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.436539 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.437145 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.437170 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.437429 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.437678 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.438040 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2" exitCode=0 Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.441415 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9qwk" event={"ID":"0e9e3466-59a6-466a-bb38-936fc4be6f9a","Type":"ContainerDied","Data":"e8e133d78de52376345e5c84ecf67325826130e6a1ce7ea41ce2431b7edc195b"} Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.441519 4841 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9qwk" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.442507 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.442754 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.443015 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.443223 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.444691 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.445556 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.445842 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.446073 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.451757 4841 scope.go:117] "RemoveContainer" containerID="968362143ac535f6e0533c678681c1322180da64a783042625b304ccd864a9d4" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.469256 4841 scope.go:117] "RemoveContainer" containerID="9bbaeae09b72aaa8865298cfa1ef7ddbd14e9d52f56a66afff3f10eef2e93e89" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.483821 4841 scope.go:117] "RemoveContainer" containerID="fd17b574103aec78139c99e2764c08850cbc24908ac4390934d0159afe9830b7" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.502891 4841 scope.go:117] 
"RemoveContainer" containerID="ef7d8b3c1abf9c4671362e9227ae208d9858fd61485832aaf39ed833414b8073" Dec 03 17:04:22 crc kubenswrapper[4841]: I1203 17:04:22.549351 4841 scope.go:117] "RemoveContainer" containerID="d2856a91b69851f2dc5f5bc03479bbe5c6d5e779d6e045fcba55070598fd378a" Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.452173 4841 generic.go:334] "Generic (PLEG): container finished" podID="5390187b-2235-4acf-9edf-385393ccaea8" containerID="02debe2bfa63944868e9352c3764ad87301cdce3cbedb08bd9d6ec5425bb68d6" exitCode=0 Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.452263 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5390187b-2235-4acf-9edf-385393ccaea8","Type":"ContainerDied","Data":"02debe2bfa63944868e9352c3764ad87301cdce3cbedb08bd9d6ec5425bb68d6"} Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.453115 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.455168 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.456286 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.456926 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd" exitCode=0 Dec 03 17:04:23 crc 
kubenswrapper[4841]: I1203 17:04:23.456952 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1" exitCode=0 Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.456982 4841 scope.go:117] "RemoveContainer" containerID="51f4ca4a299183b4787d9ab33e86db8fadece63b5db22c8878053ed8f3353c76" Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.458256 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.458533 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.458729 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:23 crc kubenswrapper[4841]: I1203 17:04:23.458979 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection 
refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.048878 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.050047 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.050539 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.050819 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.051288 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.051514 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: 
connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.051956 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.052202 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.117462 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.117554 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.117639 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.117564 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.117624 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.117788 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.117952 4841 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.117971 4841 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.117986 4841 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.244958 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.478531 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.481052 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783" exitCode=0 Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.481127 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.481165 4841 scope.go:117] "RemoveContainer" containerID="c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.481794 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.482023 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.482230 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.482521 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.482856 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.482877 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"fd94ff6354fbf6166fe8da7b0a310cbf650c0a4564368cebeaf0d52872d1237d"} Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.483167 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.484195 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.484473 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.485246 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.485659 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.486176 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.486532 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.486893 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.487240 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.487750 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.488130 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.488477 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.488772 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.498313 4841 scope.go:117] "RemoveContainer" containerID="2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.522490 4841 
scope.go:117] "RemoveContainer" containerID="c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.537668 4841 scope.go:117] "RemoveContainer" containerID="88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.554721 4841 scope.go:117] "RemoveContainer" containerID="8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.571246 4841 scope.go:117] "RemoveContainer" containerID="75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.587031 4841 scope.go:117] "RemoveContainer" containerID="c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd" Dec 03 17:04:24 crc kubenswrapper[4841]: E1203 17:04:24.587707 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\": container with ID starting with c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd not found: ID does not exist" containerID="c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.587734 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd"} err="failed to get container status \"c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\": rpc error: code = NotFound desc = could not find container \"c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd\": container with ID starting with c510fb6fe15f69e927a28b8ae40e5072b926bbf8144e1c5450f76342e48741dd not found: ID does not exist" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.587754 4841 scope.go:117] "RemoveContainer" 
containerID="2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2" Dec 03 17:04:24 crc kubenswrapper[4841]: E1203 17:04:24.588125 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\": container with ID starting with 2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2 not found: ID does not exist" containerID="2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.588158 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2"} err="failed to get container status \"2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\": rpc error: code = NotFound desc = could not find container \"2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2\": container with ID starting with 2fb2d5063b4c78cb15a151775b3c8f7e3a95cb0a55baef25203f8977fc7c92f2 not found: ID does not exist" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.588186 4841 scope.go:117] "RemoveContainer" containerID="c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1" Dec 03 17:04:24 crc kubenswrapper[4841]: E1203 17:04:24.588939 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\": container with ID starting with c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1 not found: ID does not exist" containerID="c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.588962 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1"} err="failed to get container status \"c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\": rpc error: code = NotFound desc = could not find container \"c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1\": container with ID starting with c9951048e7124adaa48e25a842460b5865d41efce43d0d2dcc58348dc1f6f3f1 not found: ID does not exist" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.588976 4841 scope.go:117] "RemoveContainer" containerID="88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550" Dec 03 17:04:24 crc kubenswrapper[4841]: E1203 17:04:24.589254 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\": container with ID starting with 88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550 not found: ID does not exist" containerID="88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.589269 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550"} err="failed to get container status \"88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\": rpc error: code = NotFound desc = could not find container \"88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550\": container with ID starting with 88746a2097035bb24fa6df7dc03905cc34b1724a742d43b747ae8312b0d62550 not found: ID does not exist" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.589280 4841 scope.go:117] "RemoveContainer" containerID="8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783" Dec 03 17:04:24 crc kubenswrapper[4841]: E1203 17:04:24.589611 4841 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\": container with ID starting with 8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783 not found: ID does not exist" containerID="8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.589629 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783"} err="failed to get container status \"8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\": rpc error: code = NotFound desc = could not find container \"8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783\": container with ID starting with 8db3ac1c68e685f00f0fe5b40fcef34fc5db3438fe310ceb41fa1a786338c783 not found: ID does not exist" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.589642 4841 scope.go:117] "RemoveContainer" containerID="75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2" Dec 03 17:04:24 crc kubenswrapper[4841]: E1203 17:04:24.589811 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\": container with ID starting with 75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2 not found: ID does not exist" containerID="75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.589824 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2"} err="failed to get container status \"75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\": rpc error: code = NotFound desc = could not find container 
\"75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2\": container with ID starting with 75ba1a499cc23ae70b262dbc7242a17cf6eaa8f939b7183fdefecfd551ca32a2 not found: ID does not exist" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.740847 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.741525 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.741815 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.743614 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.746107 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc 
kubenswrapper[4841]: I1203 17:04:24.746436 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.747591 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.830336 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5390187b-2235-4acf-9edf-385393ccaea8-kube-api-access\") pod \"5390187b-2235-4acf-9edf-385393ccaea8\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.830407 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-kubelet-dir\") pod \"5390187b-2235-4acf-9edf-385393ccaea8\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.830450 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-var-lock\") pod \"5390187b-2235-4acf-9edf-385393ccaea8\" (UID: \"5390187b-2235-4acf-9edf-385393ccaea8\") " Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.830520 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5390187b-2235-4acf-9edf-385393ccaea8" (UID: "5390187b-2235-4acf-9edf-385393ccaea8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.830609 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-var-lock" (OuterVolumeSpecName: "var-lock") pod "5390187b-2235-4acf-9edf-385393ccaea8" (UID: "5390187b-2235-4acf-9edf-385393ccaea8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.830741 4841 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.830759 4841 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5390187b-2235-4acf-9edf-385393ccaea8-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.836327 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5390187b-2235-4acf-9edf-385393ccaea8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5390187b-2235-4acf-9edf-385393ccaea8" (UID: "5390187b-2235-4acf-9edf-385393ccaea8"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:04:24 crc kubenswrapper[4841]: I1203 17:04:24.931922 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5390187b-2235-4acf-9edf-385393ccaea8-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:25 crc kubenswrapper[4841]: I1203 17:04:25.492280 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 17:04:25 crc kubenswrapper[4841]: I1203 17:04:25.492272 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"5390187b-2235-4acf-9edf-385393ccaea8","Type":"ContainerDied","Data":"f9342c043ff31d2d29ba3c8c1ee19e0e43801274bd5cc89e4078c908998b1c32"} Dec 03 17:04:25 crc kubenswrapper[4841]: I1203 17:04:25.492466 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9342c043ff31d2d29ba3c8c1ee19e0e43801274bd5cc89e4078c908998b1c32" Dec 03 17:04:25 crc kubenswrapper[4841]: I1203 17:04:25.516484 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:25 crc kubenswrapper[4841]: I1203 17:04:25.516801 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:25 crc kubenswrapper[4841]: I1203 17:04:25.517074 4841 status_manager.go:851] "Failed to get status for pod" 
podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:25 crc kubenswrapper[4841]: I1203 17:04:25.517313 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:25 crc kubenswrapper[4841]: I1203 17:04:25.517563 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:25 crc kubenswrapper[4841]: I1203 17:04:25.517826 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.183364 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.184405 4841 status_manager.go:851] "Failed to get status for pod" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" pod="openshift-marketplace/redhat-operators-8pdm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8pdm9\": dial tcp 
38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.184786 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.185155 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.185633 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.185856 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.186184 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.186562 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.240637 4841 status_manager.go:851] "Failed to get status for pod" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" pod="openshift-marketplace/redhat-operators-8pdm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8pdm9\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.240845 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.241264 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.241480 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection 
refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.241694 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.241958 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:26 crc kubenswrapper[4841]: I1203 17:04:26.242236 4841 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:29 crc kubenswrapper[4841]: E1203 17:04:29.128953 4841 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.151:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187dc3639029d40a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image 
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 17:04:21.38694145 +0000 UTC m=+255.774462177,LastTimestamp:2025-12-03 17:04:21.38694145 +0000 UTC m=+255.774462177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 17:04:29 crc kubenswrapper[4841]: E1203 17:04:29.820598 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:29 crc kubenswrapper[4841]: E1203 17:04:29.821085 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:29 crc kubenswrapper[4841]: E1203 17:04:29.821343 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:29 crc kubenswrapper[4841]: E1203 17:04:29.821585 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:29 crc kubenswrapper[4841]: E1203 17:04:29.822533 4841 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:29 crc 
kubenswrapper[4841]: I1203 17:04:29.822591 4841 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 17:04:29 crc kubenswrapper[4841]: E1203 17:04:29.823003 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="200ms" Dec 03 17:04:30 crc kubenswrapper[4841]: E1203 17:04:30.024435 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="400ms" Dec 03 17:04:30 crc kubenswrapper[4841]: E1203 17:04:30.425379 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="800ms" Dec 03 17:04:31 crc kubenswrapper[4841]: E1203 17:04:31.226714 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="1.6s" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.238074 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.238805 4841 status_manager.go:851] "Failed to get status for pod" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" pod="openshift-marketplace/redhat-operators-8pdm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8pdm9\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.239379 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.239799 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.240173 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.240483 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 
38.102.83.151:6443: connect: connection refused" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.240941 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.263593 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.263644 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:32 crc kubenswrapper[4841]: E1203 17:04:32.264301 4841 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.265161 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:32 crc kubenswrapper[4841]: W1203 17:04:32.300416 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-b760ee8a3b290844f2a6062b98930d9b6d768a52191fbb3cb744d3411f739d0c WatchSource:0}: Error finding container b760ee8a3b290844f2a6062b98930d9b6d768a52191fbb3cb744d3411f739d0c: Status 404 returned error can't find the container with id b760ee8a3b290844f2a6062b98930d9b6d768a52191fbb3cb744d3411f739d0c Dec 03 17:04:32 crc kubenswrapper[4841]: I1203 17:04:32.542525 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b760ee8a3b290844f2a6062b98930d9b6d768a52191fbb3cb744d3411f739d0c"} Dec 03 17:04:32 crc kubenswrapper[4841]: E1203 17:04:32.828432 4841 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.151:6443: connect: connection refused" interval="3.2s" Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.550418 4841 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="a72a19a1a07b8c677cd923481de867533eccde93acd0ecdbc5e2564847d584c3" exitCode=0 Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.550640 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"a72a19a1a07b8c677cd923481de867533eccde93acd0ecdbc5e2564847d584c3"} Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.550983 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.551027 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.551528 4841 status_manager.go:851] "Failed to get status for pod" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" pod="openshift-marketplace/redhat-operators-8pdm9" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-8pdm9\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:33 crc kubenswrapper[4841]: E1203 17:04:33.551639 4841 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.552014 4841 status_manager.go:851] "Failed to get status for pod" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" pod="openshift-marketplace/certified-operators-tcknq" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-tcknq\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.552547 4841 status_manager.go:851] "Failed to get status for pod" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" pod="openshift-marketplace/community-operators-8cl5k" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-8cl5k\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.553022 4841 status_manager.go:851] "Failed to get status for pod" podUID="5390187b-2235-4acf-9edf-385393ccaea8" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.553543 4841 status_manager.go:851] "Failed to get status for pod" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" pod="openshift-marketplace/redhat-marketplace-b9qwk" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-b9qwk\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:33 crc kubenswrapper[4841]: I1203 17:04:33.554020 4841 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.151:6443: connect: connection refused" Dec 03 17:04:34 crc kubenswrapper[4841]: I1203 17:04:34.565842 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"373260f116301720627d8c581a1bc63da73537fee3d783194d0be2b43b012fdd"} Dec 03 17:04:34 crc kubenswrapper[4841]: I1203 17:04:34.566186 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8c9135b39cd6a20a345c50685460a2924944dbe2e9fd3ad81c6d8f9540d6eee3"} Dec 03 17:04:34 crc kubenswrapper[4841]: I1203 17:04:34.566200 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"710414986a6f73a8c93630a116d522f8217e2ef8b1452ab6e7d80e7d84b5e4b4"} Dec 03 17:04:34 crc 
kubenswrapper[4841]: I1203 17:04:34.566211 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"08f3abbb2f7e6986ba337aaf28c47cbce6b7194da7c3d603b0cc3e3e907a36c6"} Dec 03 17:04:35 crc kubenswrapper[4841]: I1203 17:04:35.576157 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"dd5d86ab234277d8d55ca3c8286ddd0bf1a848227d78d626cd2616cc5a275cce"} Dec 03 17:04:35 crc kubenswrapper[4841]: I1203 17:04:35.576352 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:35 crc kubenswrapper[4841]: I1203 17:04:35.576436 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:35 crc kubenswrapper[4841]: I1203 17:04:35.576457 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:35 crc kubenswrapper[4841]: I1203 17:04:35.579253 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 17:04:35 crc kubenswrapper[4841]: I1203 17:04:35.579309 4841 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3" exitCode=1 Dec 03 17:04:35 crc kubenswrapper[4841]: I1203 17:04:35.579345 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3"} Dec 03 17:04:35 crc kubenswrapper[4841]: I1203 17:04:35.579781 4841 scope.go:117] "RemoveContainer" containerID="e5f9512ee6d38fc1f0c8dd24af31c5f9a0ff4623d4c1d01e43fce4c69ed13de3" Dec 03 17:04:36 crc kubenswrapper[4841]: I1203 17:04:36.586482 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 17:04:36 crc kubenswrapper[4841]: I1203 17:04:36.586752 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"709504a9aa2152e96dadc4b7c5a0e8db25e95f111d0f335684c00a3ceddb3647"} Dec 03 17:04:37 crc kubenswrapper[4841]: I1203 17:04:37.266142 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:37 crc kubenswrapper[4841]: I1203 17:04:37.266437 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:37 crc kubenswrapper[4841]: I1203 17:04:37.271741 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:38 crc kubenswrapper[4841]: I1203 17:04:38.710979 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:04:38 crc kubenswrapper[4841]: I1203 17:04:38.715147 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:04:39 crc kubenswrapper[4841]: I1203 17:04:39.611601 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:04:39 crc kubenswrapper[4841]: I1203 17:04:39.943846 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" podUID="734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" containerName="oauth-openshift" containerID="cri-o://c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b" gracePeriod=15 Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.352348 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427393 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-session\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427449 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-router-certs\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427485 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-service-ca\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427511 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-serving-cert\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427547 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-login\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427574 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-provider-selection\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427605 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-idp-0-file-data\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427686 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-cliconfig\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427719 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-trusted-ca-bundle\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427753 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-error\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427793 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-ocp-branding-template\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427827 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-policies\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427850 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-dir\") pod \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.427880 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cvw8\" (UniqueName: \"kubernetes.io/projected/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-kube-api-access-5cvw8\") pod 
\"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\" (UID: \"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92\") " Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.428678 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.429051 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.429085 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.429220 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.429271 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.429514 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.429539 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.429586 4841 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.429598 4841 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.429607 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 
03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.434559 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.435353 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.435551 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-kube-api-access-5cvw8" (OuterVolumeSpecName: "kube-api-access-5cvw8") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "kube-api-access-5cvw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.435998 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.436349 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.436693 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.441009 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.441523 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.443257 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" (UID: "734e5c9e-ba3d-4e8b-8ce2-2c686f582a92"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.530361 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.530640 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.530651 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cvw8\" (UniqueName: \"kubernetes.io/projected/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-kube-api-access-5cvw8\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.530663 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.530671 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-router-certs\") on 
node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.530680 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.530687 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.530697 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.530707 4841 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.592446 4841 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.618543 4841 generic.go:334] "Generic (PLEG): container finished" podID="734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" containerID="c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b" exitCode=0 Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.618599 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" 
event={"ID":"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92","Type":"ContainerDied","Data":"c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b"} Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.618619 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.618654 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-v9zpn" event={"ID":"734e5c9e-ba3d-4e8b-8ce2-2c686f582a92","Type":"ContainerDied","Data":"c02000ebef136067b8fb72fcb8f57a4eee8f33528d7fd1b792d6a80dd184038a"} Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.618678 4841 scope.go:117] "RemoveContainer" containerID="c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.619546 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.619567 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.623509 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.643613 4841 scope.go:117] "RemoveContainer" containerID="c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b" Dec 03 17:04:40 crc kubenswrapper[4841]: E1203 17:04:40.644031 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b\": container with ID starting with 
c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b not found: ID does not exist" containerID="c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.644069 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b"} err="failed to get container status \"c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b\": rpc error: code = NotFound desc = could not find container \"c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b\": container with ID starting with c6cfcba3da68355ceb851fcaf9ac80c30a1f05a1c2845c664a99b16138d58d5b not found: ID does not exist" Dec 03 17:04:40 crc kubenswrapper[4841]: I1203 17:04:40.747364 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d1db8bcd-19a0-46a1-9f1c-5288a6be660a" Dec 03 17:04:41 crc kubenswrapper[4841]: I1203 17:04:41.625011 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:41 crc kubenswrapper[4841]: I1203 17:04:41.625039 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:04:41 crc kubenswrapper[4841]: I1203 17:04:41.628565 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="d1db8bcd-19a0-46a1-9f1c-5288a6be660a" Dec 03 17:04:49 crc kubenswrapper[4841]: I1203 17:04:49.882890 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 17:04:50 crc 
kubenswrapper[4841]: I1203 17:04:50.745472 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 17:04:51 crc kubenswrapper[4841]: I1203 17:04:51.394624 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 17:04:51 crc kubenswrapper[4841]: I1203 17:04:51.530425 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 17:04:52 crc kubenswrapper[4841]: I1203 17:04:52.169489 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 17:04:52 crc kubenswrapper[4841]: I1203 17:04:52.380635 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 17:04:52 crc kubenswrapper[4841]: I1203 17:04:52.387843 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 17:04:52 crc kubenswrapper[4841]: I1203 17:04:52.413124 4841 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 17:04:52 crc kubenswrapper[4841]: I1203 17:04:52.601636 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 17:04:52 crc kubenswrapper[4841]: I1203 17:04:52.605090 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 17:04:52 crc kubenswrapper[4841]: I1203 17:04:52.775329 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 17:04:52 crc kubenswrapper[4841]: I1203 17:04:52.895181 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 17:04:53 crc kubenswrapper[4841]: I1203 17:04:53.357846 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 17:04:53 crc kubenswrapper[4841]: I1203 17:04:53.476286 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 17:04:53 crc kubenswrapper[4841]: I1203 17:04:53.686186 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 17:04:53 crc kubenswrapper[4841]: I1203 17:04:53.809325 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 17:04:53 crc kubenswrapper[4841]: I1203 17:04:53.871881 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 17:04:53 crc kubenswrapper[4841]: I1203 17:04:53.943432 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 17:04:53 crc kubenswrapper[4841]: I1203 17:04:53.955004 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.082373 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.127656 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.239225 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.240534 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.256606 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.503946 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.565292 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.571804 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.648619 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.827529 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.830765 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.890292 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 17:04:54 crc kubenswrapper[4841]: I1203 17:04:54.946516 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 17:04:54 crc 
kubenswrapper[4841]: I1203 17:04:54.949659 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 17:04:55 crc kubenswrapper[4841]: I1203 17:04:55.273724 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 17:04:55 crc kubenswrapper[4841]: I1203 17:04:55.470200 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 17:04:55 crc kubenswrapper[4841]: I1203 17:04:55.493592 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 17:04:55 crc kubenswrapper[4841]: I1203 17:04:55.549751 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 17:04:55 crc kubenswrapper[4841]: I1203 17:04:55.576145 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 17:04:55 crc kubenswrapper[4841]: I1203 17:04:55.706709 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 17:04:55 crc kubenswrapper[4841]: I1203 17:04:55.827015 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 17:04:55 crc kubenswrapper[4841]: I1203 17:04:55.934100 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.001962 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.146018 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.192817 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.200296 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.306803 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.333930 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.338287 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.383315 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.387989 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.409707 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.441484 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.494503 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 17:04:56 crc 
kubenswrapper[4841]: I1203 17:04:56.580303 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.634490 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.786591 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.853303 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.871569 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.882843 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 17:04:56 crc kubenswrapper[4841]: I1203 17:04:56.907997 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.100308 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.162761 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.185381 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.190879 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.261285 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.261331 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.305980 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.346382 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.353232 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.425851 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.527740 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.538117 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.554236 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.654726 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.671651 
4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.844687 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.887339 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.897614 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 17:04:57 crc kubenswrapper[4841]: I1203 17:04:57.912942 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.048988 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.115251 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.127722 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.196559 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.200221 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.230856 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 17:04:58 crc 
kubenswrapper[4841]: I1203 17:04:58.246096 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.246103 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.281626 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.340306 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.367272 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.419457 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.428808 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.579713 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.586574 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.589406 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.629949 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.783689 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.860313 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.862859 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.876232 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 17:04:58 crc kubenswrapper[4841]: I1203 17:04:58.988398 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.069625 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.071241 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.076603 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.103943 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.135889 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.146190 4841 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.234259 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.336917 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.371259 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.428786 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.524975 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.555988 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.582567 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.610838 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.612511 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.626717 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 
03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.663087 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.679626 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.731263 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.749898 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.753220 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.777513 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.786391 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.938478 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.959397 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.961812 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 17:04:59 crc kubenswrapper[4841]: 
I1203 17:04:59.965698 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 17:04:59 crc kubenswrapper[4841]: I1203 17:04:59.990383 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.000458 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.005016 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.147069 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.157428 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.306261 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.366418 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.425286 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.537985 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.777301 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 
17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.882763 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 17:05:00 crc kubenswrapper[4841]: I1203 17:05:00.982349 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.011235 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.103388 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.128706 4841 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.128814 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.166280 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.322622 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.347694 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.509751 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.596024 4841 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.604570 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.623180 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.644512 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.672157 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.720262 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.777586 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.868420 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.902551 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 17:05:01 crc kubenswrapper[4841]: I1203 17:05:01.942757 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.037537 4841 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 
17:05:02.054049 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.089591 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.127743 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.165031 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.175836 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.261362 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.267407 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.454206 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.486351 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.515579 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.522388 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.536970 4841 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.592292 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.593851 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.813530 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.898664 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 17:05:02 crc kubenswrapper[4841]: I1203 17:05:02.934569 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.023478 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.036035 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.058714 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.207664 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.266634 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 
17:05:03.325353 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.389896 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.474856 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.578600 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.673900 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.687210 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.719790 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.931641 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.940937 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.954474 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 17:05:03 crc kubenswrapper[4841]: I1203 17:05:03.962402 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.014409 
4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.048474 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.261852 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.366137 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.398025 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.506475 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.662544 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.712496 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.754639 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.900947 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.904470 4841 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 17:05:04 crc kubenswrapper[4841]: I1203 17:05:04.988168 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.013433 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.034194 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.108781 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.214029 4841 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.377937 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.492524 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.571096 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.640939 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.743630 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 
17:05:05.838232 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 17:05:05 crc kubenswrapper[4841]: I1203 17:05:05.842847 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.048301 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.202155 4841 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.204316 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.204297849 podStartE2EDuration="45.204297849s" podCreationTimestamp="2025-12-03 17:04:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:04:40.605239207 +0000 UTC m=+274.992759934" watchObservedRunningTime="2025-12-03 17:05:06.204297849 +0000 UTC m=+300.591818576" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206355 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-v9zpn","openshift-marketplace/certified-operators-tcknq","openshift-kube-apiserver/kube-apiserver-crc","openshift-marketplace/community-operators-8cl5k","openshift-marketplace/redhat-marketplace-b9qwk"] Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206420 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9","openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 17:05:06 crc kubenswrapper[4841]: E1203 17:05:06.206600 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5390187b-2235-4acf-9edf-385393ccaea8" containerName="installer" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206620 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5390187b-2235-4acf-9edf-385393ccaea8" containerName="installer" Dec 03 17:05:06 crc kubenswrapper[4841]: E1203 17:05:06.206631 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerName="extract-content" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206639 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerName="extract-content" Dec 03 17:05:06 crc kubenswrapper[4841]: E1203 17:05:06.206648 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerName="extract-utilities" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206655 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerName="extract-utilities" Dec 03 17:05:06 crc kubenswrapper[4841]: E1203 17:05:06.206665 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerName="registry-server" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206672 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerName="registry-server" Dec 03 17:05:06 crc kubenswrapper[4841]: E1203 17:05:06.206684 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerName="registry-server" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206691 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerName="registry-server" Dec 03 17:05:06 crc kubenswrapper[4841]: E1203 17:05:06.206703 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" containerName="oauth-openshift" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206709 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" containerName="oauth-openshift" Dec 03 17:05:06 crc kubenswrapper[4841]: E1203 17:05:06.206719 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerName="extract-utilities" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206725 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerName="extract-utilities" Dec 03 17:05:06 crc kubenswrapper[4841]: E1203 17:05:06.206738 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerName="extract-content" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206743 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerName="extract-content" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206817 4841 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206835 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" containerName="registry-server" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206845 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="45ccd0cc-5cf9-435b-9d20-2b3e4fd485d8" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206854 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" containerName="oauth-openshift" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206867 4841 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5390187b-2235-4acf-9edf-385393ccaea8" containerName="installer" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.206875 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" containerName="registry-server" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.207294 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.209286 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.210017 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.210172 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.210404 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.211019 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.211360 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.212049 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.212087 4841 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.212127 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.212822 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.213408 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.213420 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.213481 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.220996 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.223772 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.230459 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.239825 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-session\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " 
pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.239880 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-template-error\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.239932 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.239962 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-audit-policies\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.239995 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.240026 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-service-ca\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.240058 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.240090 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/55914512-a084-4b90-aa4a-c21885dc8ee6-audit-dir\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.240115 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-template-login\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.240183 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.240208 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.240231 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-router-certs\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.240252 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.240290 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh9ts\" (UniqueName: \"kubernetes.io/projected/55914512-a084-4b90-aa4a-c21885dc8ee6-kube-api-access-kh9ts\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: 
\"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.245137 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e9e3466-59a6-466a-bb38-936fc4be6f9a" path="/var/lib/kubelet/pods/0e9e3466-59a6-466a-bb38-936fc4be6f9a/volumes" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.245836 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16" path="/var/lib/kubelet/pods/66d0f89a-d9a1-41f1-b1be-6ed6c9e69d16/volumes" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.246698 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="734e5c9e-ba3d-4e8b-8ce2-2c686f582a92" path="/var/lib/kubelet/pods/734e5c9e-ba3d-4e8b-8ce2-2c686f582a92/volumes" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.248004 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9585f22b-d1dd-499c-a9e8-37c212f22844" path="/var/lib/kubelet/pods/9585f22b-d1dd-499c-a9e8-37c212f22844/volumes" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.274686 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.276227 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.301792 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=26.301776319 podStartE2EDuration="26.301776319s" podCreationTimestamp="2025-12-03 17:04:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:05:06.264951356 +0000 UTC m=+300.652472083" 
watchObservedRunningTime="2025-12-03 17:05:06.301776319 +0000 UTC m=+300.689297046" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.341530 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-router-certs\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.341981 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.342150 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh9ts\" (UniqueName: \"kubernetes.io/projected/55914512-a084-4b90-aa4a-c21885dc8ee6-kube-api-access-kh9ts\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.342559 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-session\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.342686 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-template-error\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.342798 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.342907 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-audit-policies\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.343464 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.343586 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-service-ca\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " 
pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.343721 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.343830 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/55914512-a084-4b90-aa4a-c21885dc8ee6-audit-dir\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.343935 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-template-login\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.344095 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.344178 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/55914512-a084-4b90-aa4a-c21885dc8ee6-audit-dir\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.344313 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.344575 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-audit-policies\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.345112 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-cliconfig\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.345514 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.345585 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-service-ca\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.346399 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.347233 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-router-certs\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.347240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-serving-cert\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.347553 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " 
pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.348546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-template-error\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.349037 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-session\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.349065 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-template-login\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.349077 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.352162 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 17:05:06 crc 
kubenswrapper[4841]: I1203 17:05:06.352270 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/55914512-a084-4b90-aa4a-c21885dc8ee6-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.359396 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh9ts\" (UniqueName: \"kubernetes.io/projected/55914512-a084-4b90-aa4a-c21885dc8ee6-kube-api-access-kh9ts\") pod \"oauth-openshift-679cb4ddc5-xf5m9\" (UID: \"55914512-a084-4b90-aa4a-c21885dc8ee6\") " pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.456812 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.526040 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.581116 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.684843 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.700683 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9"] Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.721983 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.769535 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.782203 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" event={"ID":"55914512-a084-4b90-aa4a-c21885dc8ee6","Type":"ContainerStarted","Data":"c831c677733b839ccb1912190556d36b5df482fc30d9288b1cd3d36ae50498cd"} Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.803239 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.843014 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 17:05:06 crc kubenswrapper[4841]: I1203 17:05:06.947813 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 17:05:07 crc kubenswrapper[4841]: I1203 
17:05:07.119960 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 17:05:07 crc kubenswrapper[4841]: I1203 17:05:07.217427 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 17:05:07 crc kubenswrapper[4841]: I1203 17:05:07.400542 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 17:05:07 crc kubenswrapper[4841]: I1203 17:05:07.792031 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" event={"ID":"55914512-a084-4b90-aa4a-c21885dc8ee6","Type":"ContainerStarted","Data":"e182422e6b52763f936d96459421823bf91782703e30311982611104f2bcc3a8"} Dec 03 17:05:07 crc kubenswrapper[4841]: I1203 17:05:07.823805 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" podStartSLOduration=53.823786471 podStartE2EDuration="53.823786471s" podCreationTimestamp="2025-12-03 17:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:05:07.821924817 +0000 UTC m=+302.209445554" watchObservedRunningTime="2025-12-03 17:05:07.823786471 +0000 UTC m=+302.211307198" Dec 03 17:05:07 crc kubenswrapper[4841]: I1203 17:05:07.931092 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 17:05:08 crc kubenswrapper[4841]: I1203 17:05:08.014465 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 17:05:08 crc kubenswrapper[4841]: I1203 17:05:08.246388 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 17:05:08 crc 
kubenswrapper[4841]: I1203 17:05:08.698554 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 17:05:08 crc kubenswrapper[4841]: I1203 17:05:08.796333 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:08 crc kubenswrapper[4841]: I1203 17:05:08.802096 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-679cb4ddc5-xf5m9" Dec 03 17:05:09 crc kubenswrapper[4841]: I1203 17:05:09.613646 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 17:05:10 crc kubenswrapper[4841]: I1203 17:05:10.249464 4841 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 17:05:14 crc kubenswrapper[4841]: I1203 17:05:14.384540 4841 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 17:05:14 crc kubenswrapper[4841]: I1203 17:05:14.384984 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://fd94ff6354fbf6166fe8da7b0a310cbf650c0a4564368cebeaf0d52872d1237d" gracePeriod=5 Dec 03 17:05:19 crc kubenswrapper[4841]: I1203 17:05:19.853765 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 17:05:19 crc kubenswrapper[4841]: I1203 17:05:19.854267 4841 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="fd94ff6354fbf6166fe8da7b0a310cbf650c0a4564368cebeaf0d52872d1237d" exitCode=137 Dec 03 17:05:19 crc 
kubenswrapper[4841]: I1203 17:05:19.954030 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 17:05:19 crc kubenswrapper[4841]: I1203 17:05:19.954118 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.069804 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.069854 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.069958 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.069969 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.070012 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.070040 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.070062 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.070231 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.070479 4841 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.070503 4841 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.070516 4841 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.070549 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.077755 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.172338 4841 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.172390 4841 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.247217 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.247867 4841 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.259703 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.259777 4841 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="86a9f27e-8c57-4580-a3c2-d4fdc0d22180" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.270365 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.270408 4841 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="86a9f27e-8c57-4580-a3c2-d4fdc0d22180" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.867611 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.867721 4841 scope.go:117] "RemoveContainer" containerID="fd94ff6354fbf6166fe8da7b0a310cbf650c0a4564368cebeaf0d52872d1237d" Dec 03 17:05:20 crc kubenswrapper[4841]: I1203 17:05:20.867816 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 17:05:30 crc kubenswrapper[4841]: I1203 17:05:30.816479 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mj44m"] Dec 03 17:05:30 crc kubenswrapper[4841]: I1203 17:05:30.816964 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" podUID="bd9368a6-dc6b-42fd-9062-a01612ceb28c" containerName="controller-manager" containerID="cri-o://1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935" gracePeriod=30 Dec 03 17:05:30 crc kubenswrapper[4841]: I1203 17:05:30.938403 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz"] Dec 03 17:05:30 crc kubenswrapper[4841]: I1203 17:05:30.938852 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" podUID="876346bb-a538-4b29-a71f-6ca64d8b60f0" containerName="route-controller-manager" containerID="cri-o://4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887" gracePeriod=30 Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.227594 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.311704 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.324449 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/876346bb-a538-4b29-a71f-6ca64d8b60f0-serving-cert\") pod \"876346bb-a538-4b29-a71f-6ca64d8b60f0\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.324636 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jchxw\" (UniqueName: \"kubernetes.io/projected/bd9368a6-dc6b-42fd-9062-a01612ceb28c-kube-api-access-jchxw\") pod \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.324689 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-config\") pod \"876346bb-a538-4b29-a71f-6ca64d8b60f0\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.324714 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-proxy-ca-bundles\") pod \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.325662 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod 
"bd9368a6-dc6b-42fd-9062-a01612ceb28c" (UID: "bd9368a6-dc6b-42fd-9062-a01612ceb28c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.325892 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-config" (OuterVolumeSpecName: "config") pod "876346bb-a538-4b29-a71f-6ca64d8b60f0" (UID: "876346bb-a538-4b29-a71f-6ca64d8b60f0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.326878 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-client-ca\") pod \"876346bb-a538-4b29-a71f-6ca64d8b60f0\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.326950 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7jmx\" (UniqueName: \"kubernetes.io/projected/876346bb-a538-4b29-a71f-6ca64d8b60f0-kube-api-access-d7jmx\") pod \"876346bb-a538-4b29-a71f-6ca64d8b60f0\" (UID: \"876346bb-a538-4b29-a71f-6ca64d8b60f0\") " Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.326979 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-config\") pod \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.327014 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9368a6-dc6b-42fd-9062-a01612ceb28c-serving-cert\") pod \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " 
Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.327045 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-client-ca\") pod \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\" (UID: \"bd9368a6-dc6b-42fd-9062-a01612ceb28c\") " Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.327356 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.327371 4841 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.327707 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-client-ca" (OuterVolumeSpecName: "client-ca") pod "bd9368a6-dc6b-42fd-9062-a01612ceb28c" (UID: "bd9368a6-dc6b-42fd-9062-a01612ceb28c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.328013 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-client-ca" (OuterVolumeSpecName: "client-ca") pod "876346bb-a538-4b29-a71f-6ca64d8b60f0" (UID: "876346bb-a538-4b29-a71f-6ca64d8b60f0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.328723 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-config" (OuterVolumeSpecName: "config") pod "bd9368a6-dc6b-42fd-9062-a01612ceb28c" (UID: "bd9368a6-dc6b-42fd-9062-a01612ceb28c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.331244 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/876346bb-a538-4b29-a71f-6ca64d8b60f0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "876346bb-a538-4b29-a71f-6ca64d8b60f0" (UID: "876346bb-a538-4b29-a71f-6ca64d8b60f0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.332234 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/876346bb-a538-4b29-a71f-6ca64d8b60f0-kube-api-access-d7jmx" (OuterVolumeSpecName: "kube-api-access-d7jmx") pod "876346bb-a538-4b29-a71f-6ca64d8b60f0" (UID: "876346bb-a538-4b29-a71f-6ca64d8b60f0"). InnerVolumeSpecName "kube-api-access-d7jmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.334999 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd9368a6-dc6b-42fd-9062-a01612ceb28c-kube-api-access-jchxw" (OuterVolumeSpecName: "kube-api-access-jchxw") pod "bd9368a6-dc6b-42fd-9062-a01612ceb28c" (UID: "bd9368a6-dc6b-42fd-9062-a01612ceb28c"). InnerVolumeSpecName "kube-api-access-jchxw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.335104 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd9368a6-dc6b-42fd-9062-a01612ceb28c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bd9368a6-dc6b-42fd-9062-a01612ceb28c" (UID: "bd9368a6-dc6b-42fd-9062-a01612ceb28c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.428870 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jchxw\" (UniqueName: \"kubernetes.io/projected/bd9368a6-dc6b-42fd-9062-a01612ceb28c-kube-api-access-jchxw\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.428925 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/876346bb-a538-4b29-a71f-6ca64d8b60f0-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.428934 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7jmx\" (UniqueName: \"kubernetes.io/projected/876346bb-a538-4b29-a71f-6ca64d8b60f0-kube-api-access-d7jmx\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.428943 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.428951 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9368a6-dc6b-42fd-9062-a01612ceb28c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.428959 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/bd9368a6-dc6b-42fd-9062-a01612ceb28c-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.428966 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/876346bb-a538-4b29-a71f-6ca64d8b60f0-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.943040 4841 generic.go:334] "Generic (PLEG): container finished" podID="876346bb-a538-4b29-a71f-6ca64d8b60f0" containerID="4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887" exitCode=0 Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.943086 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.943104 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" event={"ID":"876346bb-a538-4b29-a71f-6ca64d8b60f0","Type":"ContainerDied","Data":"4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887"} Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.943539 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz" event={"ID":"876346bb-a538-4b29-a71f-6ca64d8b60f0","Type":"ContainerDied","Data":"811ad334cfe9d7f26f06b65278eec7890241fc0cec2d95c00ae66cc23298e260"} Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.943563 4841 scope.go:117] "RemoveContainer" containerID="4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.944885 4841 generic.go:334] "Generic (PLEG): container finished" podID="bd9368a6-dc6b-42fd-9062-a01612ceb28c" containerID="1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935" exitCode=0 Dec 03 
17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.944938 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" event={"ID":"bd9368a6-dc6b-42fd-9062-a01612ceb28c","Type":"ContainerDied","Data":"1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935"} Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.944965 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" event={"ID":"bd9368a6-dc6b-42fd-9062-a01612ceb28c","Type":"ContainerDied","Data":"11de0bd5c737d7a708f0367cab344c6d108fd155bc7c8f8bafc96be0c4f3dabf"} Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.944982 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mj44m" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.958246 4841 scope.go:117] "RemoveContainer" containerID="4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887" Dec 03 17:05:31 crc kubenswrapper[4841]: E1203 17:05:31.958599 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887\": container with ID starting with 4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887 not found: ID does not exist" containerID="4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.958653 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887"} err="failed to get container status \"4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887\": rpc error: code = NotFound desc = could not find container \"4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887\": 
container with ID starting with 4d7c6654e43ade2f61b836ea238e5b5b4743e3869b23d23fb912ca49bcbf3887 not found: ID does not exist" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.958674 4841 scope.go:117] "RemoveContainer" containerID="1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.988280 4841 scope.go:117] "RemoveContainer" containerID="1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935" Dec 03 17:05:31 crc kubenswrapper[4841]: E1203 17:05:31.989050 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935\": container with ID starting with 1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935 not found: ID does not exist" containerID="1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.989091 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935"} err="failed to get container status \"1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935\": rpc error: code = NotFound desc = could not find container \"1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935\": container with ID starting with 1ff1393358fa3b4dff912810cc88385c0804da60b446a9d78fae3c80625af935 not found: ID does not exist" Dec 03 17:05:31 crc kubenswrapper[4841]: I1203 17:05:31.997395 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz"] Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.011688 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-59tlz"] Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 
17:05:32.017004 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mj44m"] Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.020357 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mj44m"] Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.162559 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx"] Dec 03 17:05:32 crc kubenswrapper[4841]: E1203 17:05:32.162796 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.162810 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 17:05:32 crc kubenswrapper[4841]: E1203 17:05:32.162827 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="876346bb-a538-4b29-a71f-6ca64d8b60f0" containerName="route-controller-manager" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.162836 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="876346bb-a538-4b29-a71f-6ca64d8b60f0" containerName="route-controller-manager" Dec 03 17:05:32 crc kubenswrapper[4841]: E1203 17:05:32.162851 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd9368a6-dc6b-42fd-9062-a01612ceb28c" containerName="controller-manager" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.162859 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd9368a6-dc6b-42fd-9062-a01612ceb28c" containerName="controller-manager" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.163000 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="876346bb-a538-4b29-a71f-6ca64d8b60f0" containerName="route-controller-manager" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 
17:05:32.163013 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd9368a6-dc6b-42fd-9062-a01612ceb28c" containerName="controller-manager" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.163025 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.163476 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.165851 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.166134 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.166429 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.166627 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.166760 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.167171 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.168351 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-598fffb67c-gjsp7"] Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.169070 4841 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.171702 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.171847 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.172079 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.173400 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.173511 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx"] Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.173677 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.173765 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.189317 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.214377 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-598fffb67c-gjsp7"] Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.241329 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/064467ef-c3e0-4219-936c-9f0d581b065d-serving-cert\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.241379 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3955b4a2-57e3-4990-b31a-ef55e401169e-config\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.241413 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3955b4a2-57e3-4990-b31a-ef55e401169e-serving-cert\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.241435 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3955b4a2-57e3-4990-b31a-ef55e401169e-client-ca\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.241452 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-client-ca\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " 
pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.241470 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3955b4a2-57e3-4990-b31a-ef55e401169e-proxy-ca-bundles\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.241487 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-config\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.241512 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfvr7\" (UniqueName: \"kubernetes.io/projected/3955b4a2-57e3-4990-b31a-ef55e401169e-kube-api-access-qfvr7\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.241602 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhc4l\" (UniqueName: \"kubernetes.io/projected/064467ef-c3e0-4219-936c-9f0d581b065d-kube-api-access-mhc4l\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.246431 4841 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="876346bb-a538-4b29-a71f-6ca64d8b60f0" path="/var/lib/kubelet/pods/876346bb-a538-4b29-a71f-6ca64d8b60f0/volumes" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.247113 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd9368a6-dc6b-42fd-9062-a01612ceb28c" path="/var/lib/kubelet/pods/bd9368a6-dc6b-42fd-9062-a01612ceb28c/volumes" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.343032 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3955b4a2-57e3-4990-b31a-ef55e401169e-config\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.343123 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3955b4a2-57e3-4990-b31a-ef55e401169e-serving-cert\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.344491 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3955b4a2-57e3-4990-b31a-ef55e401169e-config\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.344552 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3955b4a2-57e3-4990-b31a-ef55e401169e-client-ca\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " 
pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.344573 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-client-ca\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.345221 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3955b4a2-57e3-4990-b31a-ef55e401169e-client-ca\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.344589 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3955b4a2-57e3-4990-b31a-ef55e401169e-proxy-ca-bundles\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.345286 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-config\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.345317 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfvr7\" (UniqueName: 
\"kubernetes.io/projected/3955b4a2-57e3-4990-b31a-ef55e401169e-kube-api-access-qfvr7\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.345555 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-client-ca\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.345633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3955b4a2-57e3-4990-b31a-ef55e401169e-proxy-ca-bundles\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.346117 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-config\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.346194 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhc4l\" (UniqueName: \"kubernetes.io/projected/064467ef-c3e0-4219-936c-9f0d581b065d-kube-api-access-mhc4l\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc 
kubenswrapper[4841]: I1203 17:05:32.346250 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/064467ef-c3e0-4219-936c-9f0d581b065d-serving-cert\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.347651 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3955b4a2-57e3-4990-b31a-ef55e401169e-serving-cert\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.349011 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/064467ef-c3e0-4219-936c-9f0d581b065d-serving-cert\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.364652 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfvr7\" (UniqueName: \"kubernetes.io/projected/3955b4a2-57e3-4990-b31a-ef55e401169e-kube-api-access-qfvr7\") pod \"controller-manager-598fffb67c-gjsp7\" (UID: \"3955b4a2-57e3-4990-b31a-ef55e401169e\") " pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.373604 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhc4l\" (UniqueName: \"kubernetes.io/projected/064467ef-c3e0-4219-936c-9f0d581b065d-kube-api-access-mhc4l\") pod \"route-controller-manager-7bf9495b6c-z99rx\" (UID: 
\"064467ef-c3e0-4219-936c-9f0d581b065d\") " pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.486933 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.502221 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.768858 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx"] Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.821371 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-598fffb67c-gjsp7"] Dec 03 17:05:32 crc kubenswrapper[4841]: W1203 17:05:32.832665 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3955b4a2_57e3_4990_b31a_ef55e401169e.slice/crio-0d166abb22cee365b97abc40608d2f35267b074051217fec51ca52b646ca0514 WatchSource:0}: Error finding container 0d166abb22cee365b97abc40608d2f35267b074051217fec51ca52b646ca0514: Status 404 returned error can't find the container with id 0d166abb22cee365b97abc40608d2f35267b074051217fec51ca52b646ca0514 Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.952567 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" event={"ID":"064467ef-c3e0-4219-936c-9f0d581b065d","Type":"ContainerStarted","Data":"c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd"} Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.952615 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" event={"ID":"064467ef-c3e0-4219-936c-9f0d581b065d","Type":"ContainerStarted","Data":"d18c288ac6e3663a38b575bd8bcf0da0d066a9696e41bfd12ee05b56d0a47484"} Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.952925 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.953935 4841 patch_prober.go:28] interesting pod/route-controller-manager-7bf9495b6c-z99rx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.953977 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" podUID="064467ef-c3e0-4219-936c-9f0d581b065d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.956569 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" event={"ID":"3955b4a2-57e3-4990-b31a-ef55e401169e","Type":"ContainerStarted","Data":"0d166abb22cee365b97abc40608d2f35267b074051217fec51ca52b646ca0514"} Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.956838 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.958978 4841 patch_prober.go:28] interesting pod/controller-manager-598fffb67c-gjsp7 container/controller-manager namespace/openshift-controller-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" start-of-body= Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.959025 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" podUID="3955b4a2-57e3-4990-b31a-ef55e401169e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": dial tcp 10.217.0.58:8443: connect: connection refused" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.970671 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" podStartSLOduration=2.970653817 podStartE2EDuration="2.970653817s" podCreationTimestamp="2025-12-03 17:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:05:32.967577706 +0000 UTC m=+327.355098433" watchObservedRunningTime="2025-12-03 17:05:32.970653817 +0000 UTC m=+327.358174544" Dec 03 17:05:32 crc kubenswrapper[4841]: I1203 17:05:32.988970 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" podStartSLOduration=2.988953783 podStartE2EDuration="2.988953783s" podCreationTimestamp="2025-12-03 17:05:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:05:32.987947456 +0000 UTC m=+327.375468193" watchObservedRunningTime="2025-12-03 17:05:32.988953783 +0000 UTC m=+327.376474510" Dec 03 17:05:33 crc kubenswrapper[4841]: I1203 17:05:33.962268 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" 
event={"ID":"3955b4a2-57e3-4990-b31a-ef55e401169e","Type":"ContainerStarted","Data":"9dd403d370c3c82746f5f14e7d8c9cd82fdc43432693e4b927d3f502a632e66e"} Dec 03 17:05:33 crc kubenswrapper[4841]: I1203 17:05:33.965868 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-598fffb67c-gjsp7" Dec 03 17:05:33 crc kubenswrapper[4841]: I1203 17:05:33.967778 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:05:50 crc kubenswrapper[4841]: I1203 17:05:50.817137 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8pdm9"] Dec 03 17:05:50 crc kubenswrapper[4841]: I1203 17:05:50.818096 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8pdm9" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerName="registry-server" containerID="cri-o://5d79d13e767f4c0046bfaceb777c6e578a636364aa802ffe08e1095e8295ecd7" gracePeriod=2 Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.164550 4841 generic.go:334] "Generic (PLEG): container finished" podID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerID="5d79d13e767f4c0046bfaceb777c6e578a636364aa802ffe08e1095e8295ecd7" exitCode=0 Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.164729 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pdm9" event={"ID":"09e5fe7e-183b-431c-a431-e58aeba0e1aa","Type":"ContainerDied","Data":"5d79d13e767f4c0046bfaceb777c6e578a636364aa802ffe08e1095e8295ecd7"} Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.346774 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.430383 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s99j\" (UniqueName: \"kubernetes.io/projected/09e5fe7e-183b-431c-a431-e58aeba0e1aa-kube-api-access-7s99j\") pod \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.430475 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-catalog-content\") pod \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.430530 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-utilities\") pod \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\" (UID: \"09e5fe7e-183b-431c-a431-e58aeba0e1aa\") " Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.431531 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-utilities" (OuterVolumeSpecName: "utilities") pod "09e5fe7e-183b-431c-a431-e58aeba0e1aa" (UID: "09e5fe7e-183b-431c-a431-e58aeba0e1aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.444090 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09e5fe7e-183b-431c-a431-e58aeba0e1aa-kube-api-access-7s99j" (OuterVolumeSpecName: "kube-api-access-7s99j") pod "09e5fe7e-183b-431c-a431-e58aeba0e1aa" (UID: "09e5fe7e-183b-431c-a431-e58aeba0e1aa"). InnerVolumeSpecName "kube-api-access-7s99j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.532130 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.532425 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s99j\" (UniqueName: \"kubernetes.io/projected/09e5fe7e-183b-431c-a431-e58aeba0e1aa-kube-api-access-7s99j\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.550923 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09e5fe7e-183b-431c-a431-e58aeba0e1aa" (UID: "09e5fe7e-183b-431c-a431-e58aeba0e1aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:05:51 crc kubenswrapper[4841]: I1203 17:05:51.633788 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09e5fe7e-183b-431c-a431-e58aeba0e1aa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:05:52 crc kubenswrapper[4841]: I1203 17:05:52.179597 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8pdm9" event={"ID":"09e5fe7e-183b-431c-a431-e58aeba0e1aa","Type":"ContainerDied","Data":"8528be7e01068c63587f4a63fa0ed1c726f36b4d4c02353ecb696ddc841b6356"} Dec 03 17:05:52 crc kubenswrapper[4841]: I1203 17:05:52.179665 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8pdm9" Dec 03 17:05:52 crc kubenswrapper[4841]: I1203 17:05:52.179675 4841 scope.go:117] "RemoveContainer" containerID="5d79d13e767f4c0046bfaceb777c6e578a636364aa802ffe08e1095e8295ecd7" Dec 03 17:05:52 crc kubenswrapper[4841]: I1203 17:05:52.204852 4841 scope.go:117] "RemoveContainer" containerID="115887d5dee0eb158c5be9470bbbca73271555f6219dfd5a29b9473f24a8e10a" Dec 03 17:05:52 crc kubenswrapper[4841]: I1203 17:05:52.220058 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8pdm9"] Dec 03 17:05:52 crc kubenswrapper[4841]: I1203 17:05:52.225676 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8pdm9"] Dec 03 17:05:52 crc kubenswrapper[4841]: I1203 17:05:52.247838 4841 scope.go:117] "RemoveContainer" containerID="572ea3bb9967c3b301eb5b11b7d08ee70c33cd0f5b5aeb5929e73aa78550c1b0" Dec 03 17:05:52 crc kubenswrapper[4841]: I1203 17:05:52.249825 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" path="/var/lib/kubelet/pods/09e5fe7e-183b-431c-a431-e58aeba0e1aa/volumes" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.038547 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dshxz"] Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.040009 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dshxz" podUID="68909a3d-4731-4851-a511-0b66e05d3741" containerName="registry-server" containerID="cri-o://ff00c6f334f026ce9d777ebc4c69728dbcd264c03f8734e4abd9a21ad0ed7d28" gracePeriod=30 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.062316 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nr854"] Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.062844 4841 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-nr854" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerName="registry-server" containerID="cri-o://30958f4cd329db5ab4713fa275cbdaa5a247e2af6710d7b7cb275b2eaa8f3050" gracePeriod=30 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.065363 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p6dg7"] Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.065553 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" podUID="0648214d-f0e4-4a8d-8f0d-2c3751c8e369" containerName="marketplace-operator" containerID="cri-o://3f8e2c9167ae16077eab874a8a0c00dadcb95f3fc02d4427ebabdd11d585f04f" gracePeriod=30 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.077866 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nlrj"] Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.078102 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4nlrj" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerName="registry-server" containerID="cri-o://97c27b5a771ebdb220ec75ef919975bee194d96e69b2b8c2ae5ede7432414b33" gracePeriod=30 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.099848 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-glndx"] Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.100172 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-glndx" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerName="registry-server" containerID="cri-o://c94dcdd6c11ee6a14ed9505b6a249e47937a04dab5188ebdff788d2b54377bcc" gracePeriod=30 Dec 03 17:06:08 crc kubenswrapper[4841]: 
I1203 17:06:08.104976 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhx4z"] Dec 03 17:06:08 crc kubenswrapper[4841]: E1203 17:06:08.105212 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerName="extract-content" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.105223 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerName="extract-content" Dec 03 17:06:08 crc kubenswrapper[4841]: E1203 17:06:08.105247 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerName="registry-server" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.105255 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerName="registry-server" Dec 03 17:06:08 crc kubenswrapper[4841]: E1203 17:06:08.105282 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerName="extract-utilities" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.105288 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerName="extract-utilities" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.105391 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="09e5fe7e-183b-431c-a431-e58aeba0e1aa" containerName="registry-server" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.105801 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.117709 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhx4z"] Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.176894 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9n2p\" (UniqueName: \"kubernetes.io/projected/b200dd17-70ee-42af-a890-b7f748be7b01-kube-api-access-q9n2p\") pod \"marketplace-operator-79b997595-qhx4z\" (UID: \"b200dd17-70ee-42af-a890-b7f748be7b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.177001 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b200dd17-70ee-42af-a890-b7f748be7b01-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qhx4z\" (UID: \"b200dd17-70ee-42af-a890-b7f748be7b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.177092 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b200dd17-70ee-42af-a890-b7f748be7b01-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qhx4z\" (UID: \"b200dd17-70ee-42af-a890-b7f748be7b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.277722 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b200dd17-70ee-42af-a890-b7f748be7b01-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qhx4z\" (UID: 
\"b200dd17-70ee-42af-a890-b7f748be7b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.277768 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9n2p\" (UniqueName: \"kubernetes.io/projected/b200dd17-70ee-42af-a890-b7f748be7b01-kube-api-access-q9n2p\") pod \"marketplace-operator-79b997595-qhx4z\" (UID: \"b200dd17-70ee-42af-a890-b7f748be7b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.277801 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b200dd17-70ee-42af-a890-b7f748be7b01-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qhx4z\" (UID: \"b200dd17-70ee-42af-a890-b7f748be7b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.279800 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b200dd17-70ee-42af-a890-b7f748be7b01-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qhx4z\" (UID: \"b200dd17-70ee-42af-a890-b7f748be7b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.285043 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b200dd17-70ee-42af-a890-b7f748be7b01-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qhx4z\" (UID: \"b200dd17-70ee-42af-a890-b7f748be7b01\") " pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.292133 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerID="97c27b5a771ebdb220ec75ef919975bee194d96e69b2b8c2ae5ede7432414b33" exitCode=0 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.292187 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nlrj" event={"ID":"f9594e34-3c26-4c76-b090-c9d8218398a6","Type":"ContainerDied","Data":"97c27b5a771ebdb220ec75ef919975bee194d96e69b2b8c2ae5ede7432414b33"} Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.295470 4841 generic.go:334] "Generic (PLEG): container finished" podID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerID="30958f4cd329db5ab4713fa275cbdaa5a247e2af6710d7b7cb275b2eaa8f3050" exitCode=0 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.295637 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr854" event={"ID":"73b4070c-62dc-49b6-b2fe-8ae468318da3","Type":"ContainerDied","Data":"30958f4cd329db5ab4713fa275cbdaa5a247e2af6710d7b7cb275b2eaa8f3050"} Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.297609 4841 generic.go:334] "Generic (PLEG): container finished" podID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerID="c94dcdd6c11ee6a14ed9505b6a249e47937a04dab5188ebdff788d2b54377bcc" exitCode=0 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.297652 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glndx" event={"ID":"958ead90-9bd3-4c1b-b9e5-21378ecff345","Type":"ContainerDied","Data":"c94dcdd6c11ee6a14ed9505b6a249e47937a04dab5188ebdff788d2b54377bcc"} Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.298179 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9n2p\" (UniqueName: \"kubernetes.io/projected/b200dd17-70ee-42af-a890-b7f748be7b01-kube-api-access-q9n2p\") pod \"marketplace-operator-79b997595-qhx4z\" (UID: \"b200dd17-70ee-42af-a890-b7f748be7b01\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.299656 4841 generic.go:334] "Generic (PLEG): container finished" podID="68909a3d-4731-4851-a511-0b66e05d3741" containerID="ff00c6f334f026ce9d777ebc4c69728dbcd264c03f8734e4abd9a21ad0ed7d28" exitCode=0 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.299692 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dshxz" event={"ID":"68909a3d-4731-4851-a511-0b66e05d3741","Type":"ContainerDied","Data":"ff00c6f334f026ce9d777ebc4c69728dbcd264c03f8734e4abd9a21ad0ed7d28"} Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.301449 4841 generic.go:334] "Generic (PLEG): container finished" podID="0648214d-f0e4-4a8d-8f0d-2c3751c8e369" containerID="3f8e2c9167ae16077eab874a8a0c00dadcb95f3fc02d4427ebabdd11d585f04f" exitCode=0 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.301471 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" event={"ID":"0648214d-f0e4-4a8d-8f0d-2c3751c8e369","Type":"ContainerDied","Data":"3f8e2c9167ae16077eab874a8a0c00dadcb95f3fc02d4427ebabdd11d585f04f"} Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.430987 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.529151 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.580717 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vn8zd\" (UniqueName: \"kubernetes.io/projected/68909a3d-4731-4851-a511-0b66e05d3741-kube-api-access-vn8zd\") pod \"68909a3d-4731-4851-a511-0b66e05d3741\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.580803 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-utilities\") pod \"68909a3d-4731-4851-a511-0b66e05d3741\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.581197 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-catalog-content\") pod \"68909a3d-4731-4851-a511-0b66e05d3741\" (UID: \"68909a3d-4731-4851-a511-0b66e05d3741\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.585716 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68909a3d-4731-4851-a511-0b66e05d3741-kube-api-access-vn8zd" (OuterVolumeSpecName: "kube-api-access-vn8zd") pod "68909a3d-4731-4851-a511-0b66e05d3741" (UID: "68909a3d-4731-4851-a511-0b66e05d3741"). InnerVolumeSpecName "kube-api-access-vn8zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.586253 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-utilities" (OuterVolumeSpecName: "utilities") pod "68909a3d-4731-4851-a511-0b66e05d3741" (UID: "68909a3d-4731-4851-a511-0b66e05d3741"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.630416 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68909a3d-4731-4851-a511-0b66e05d3741" (UID: "68909a3d-4731-4851-a511-0b66e05d3741"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.646613 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nr854" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.652879 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.655608 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.669530 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nlrj" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.684578 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vn8zd\" (UniqueName: \"kubernetes.io/projected/68909a3d-4731-4851-a511-0b66e05d3741-kube-api-access-vn8zd\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.684625 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.684642 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68909a3d-4731-4851-a511-0b66e05d3741-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785487 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk5fp\" (UniqueName: \"kubernetes.io/projected/f9594e34-3c26-4c76-b090-c9d8218398a6-kube-api-access-bk5fp\") pod \"f9594e34-3c26-4c76-b090-c9d8218398a6\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785537 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q48cg\" (UniqueName: \"kubernetes.io/projected/958ead90-9bd3-4c1b-b9e5-21378ecff345-kube-api-access-q48cg\") pod \"958ead90-9bd3-4c1b-b9e5-21378ecff345\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785583 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-utilities\") pod \"73b4070c-62dc-49b6-b2fe-8ae468318da3\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " Dec 03 17:06:08 crc 
kubenswrapper[4841]: I1203 17:06:08.785606 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-utilities\") pod \"958ead90-9bd3-4c1b-b9e5-21378ecff345\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785629 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flmb5\" (UniqueName: \"kubernetes.io/projected/73b4070c-62dc-49b6-b2fe-8ae468318da3-kube-api-access-flmb5\") pod \"73b4070c-62dc-49b6-b2fe-8ae468318da3\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-utilities\") pod \"f9594e34-3c26-4c76-b090-c9d8218398a6\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785692 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-catalog-content\") pod \"73b4070c-62dc-49b6-b2fe-8ae468318da3\" (UID: \"73b4070c-62dc-49b6-b2fe-8ae468318da3\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785711 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-trusted-ca\") pod \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785727 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-catalog-content\") pod \"f9594e34-3c26-4c76-b090-c9d8218398a6\" (UID: \"f9594e34-3c26-4c76-b090-c9d8218398a6\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785743 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk99j\" (UniqueName: \"kubernetes.io/projected/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-kube-api-access-bk99j\") pod \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785759 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-catalog-content\") pod \"958ead90-9bd3-4c1b-b9e5-21378ecff345\" (UID: \"958ead90-9bd3-4c1b-b9e5-21378ecff345\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.785778 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-operator-metrics\") pod \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\" (UID: \"0648214d-f0e4-4a8d-8f0d-2c3751c8e369\") " Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.786452 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-utilities" (OuterVolumeSpecName: "utilities") pod "958ead90-9bd3-4c1b-b9e5-21378ecff345" (UID: "958ead90-9bd3-4c1b-b9e5-21378ecff345"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.786505 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "0648214d-f0e4-4a8d-8f0d-2c3751c8e369" (UID: "0648214d-f0e4-4a8d-8f0d-2c3751c8e369"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.786541 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-utilities" (OuterVolumeSpecName: "utilities") pod "73b4070c-62dc-49b6-b2fe-8ae468318da3" (UID: "73b4070c-62dc-49b6-b2fe-8ae468318da3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.787392 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-utilities" (OuterVolumeSpecName: "utilities") pod "f9594e34-3c26-4c76-b090-c9d8218398a6" (UID: "f9594e34-3c26-4c76-b090-c9d8218398a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.788932 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73b4070c-62dc-49b6-b2fe-8ae468318da3-kube-api-access-flmb5" (OuterVolumeSpecName: "kube-api-access-flmb5") pod "73b4070c-62dc-49b6-b2fe-8ae468318da3" (UID: "73b4070c-62dc-49b6-b2fe-8ae468318da3"). InnerVolumeSpecName "kube-api-access-flmb5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.788972 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9594e34-3c26-4c76-b090-c9d8218398a6-kube-api-access-bk5fp" (OuterVolumeSpecName: "kube-api-access-bk5fp") pod "f9594e34-3c26-4c76-b090-c9d8218398a6" (UID: "f9594e34-3c26-4c76-b090-c9d8218398a6"). InnerVolumeSpecName "kube-api-access-bk5fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.789416 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-kube-api-access-bk99j" (OuterVolumeSpecName: "kube-api-access-bk99j") pod "0648214d-f0e4-4a8d-8f0d-2c3751c8e369" (UID: "0648214d-f0e4-4a8d-8f0d-2c3751c8e369"). InnerVolumeSpecName "kube-api-access-bk99j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.789476 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "0648214d-f0e4-4a8d-8f0d-2c3751c8e369" (UID: "0648214d-f0e4-4a8d-8f0d-2c3751c8e369"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.792458 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/958ead90-9bd3-4c1b-b9e5-21378ecff345-kube-api-access-q48cg" (OuterVolumeSpecName: "kube-api-access-q48cg") pod "958ead90-9bd3-4c1b-b9e5-21378ecff345" (UID: "958ead90-9bd3-4c1b-b9e5-21378ecff345"). InnerVolumeSpecName "kube-api-access-q48cg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.804164 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f9594e34-3c26-4c76-b090-c9d8218398a6" (UID: "f9594e34-3c26-4c76-b090-c9d8218398a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.844790 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "73b4070c-62dc-49b6-b2fe-8ae468318da3" (UID: "73b4070c-62dc-49b6-b2fe-8ae468318da3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887564 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk5fp\" (UniqueName: \"kubernetes.io/projected/f9594e34-3c26-4c76-b090-c9d8218398a6-kube-api-access-bk5fp\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887615 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q48cg\" (UniqueName: \"kubernetes.io/projected/958ead90-9bd3-4c1b-b9e5-21378ecff345-kube-api-access-q48cg\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887636 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887656 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-utilities\") on 
node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887673 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flmb5\" (UniqueName: \"kubernetes.io/projected/73b4070c-62dc-49b6-b2fe-8ae468318da3-kube-api-access-flmb5\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887690 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887708 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/73b4070c-62dc-49b6-b2fe-8ae468318da3-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887724 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f9594e34-3c26-4c76-b090-c9d8218398a6-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887741 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887757 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk99j\" (UniqueName: \"kubernetes.io/projected/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-kube-api-access-bk99j\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.887774 4841 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/0648214d-f0e4-4a8d-8f0d-2c3751c8e369-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:08 crc 
kubenswrapper[4841]: I1203 17:06:08.893738 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "958ead90-9bd3-4c1b-b9e5-21378ecff345" (UID: "958ead90-9bd3-4c1b-b9e5-21378ecff345"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.929030 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qhx4z"] Dec 03 17:06:08 crc kubenswrapper[4841]: W1203 17:06:08.935010 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb200dd17_70ee_42af_a890_b7f748be7b01.slice/crio-f65d9167ace8e65b2dbcd4a242e72fa22cbcdb44df756e82e12d1ecbd0d53a26 WatchSource:0}: Error finding container f65d9167ace8e65b2dbcd4a242e72fa22cbcdb44df756e82e12d1ecbd0d53a26: Status 404 returned error can't find the container with id f65d9167ace8e65b2dbcd4a242e72fa22cbcdb44df756e82e12d1ecbd0d53a26 Dec 03 17:06:08 crc kubenswrapper[4841]: I1203 17:06:08.989339 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/958ead90-9bd3-4c1b-b9e5-21378ecff345-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.312001 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4nlrj" event={"ID":"f9594e34-3c26-4c76-b090-c9d8218398a6","Type":"ContainerDied","Data":"feec0fa44f50fc643834a5ecd1dbcbaae36488a498f27bfe8e9687258e53cc2a"} Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.312982 4841 scope.go:117] "RemoveContainer" containerID="97c27b5a771ebdb220ec75ef919975bee194d96e69b2b8c2ae5ede7432414b33" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.312017 4841 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4nlrj" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.316414 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.316456 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.316975 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-nr854" event={"ID":"73b4070c-62dc-49b6-b2fe-8ae468318da3","Type":"ContainerDied","Data":"ab7ab9c7da70f9ede7e5edcc7f3f37d5cf9f534fb1326e4ef72235f8ef6a3a95"} Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.317036 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-nr854" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.318749 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-glndx" event={"ID":"958ead90-9bd3-4c1b-b9e5-21378ecff345","Type":"ContainerDied","Data":"69bdebac4d4a7425247e1c67b2ebe6f9be56e9591fd8bb9ac1e657579d4a7a63"} Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.318825 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-glndx" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.328466 4841 scope.go:117] "RemoveContainer" containerID="351945b9e578ccdfaa9fe7ac76a60a905b539dd2487be7e25915fa87e9faa82f" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.331066 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dshxz" event={"ID":"68909a3d-4731-4851-a511-0b66e05d3741","Type":"ContainerDied","Data":"4f9c8101b7c5131f276091ce9a1e61182e5553d6de7de744d67117b8a6024718"} Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.331279 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dshxz" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.339008 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.339000 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-p6dg7" event={"ID":"0648214d-f0e4-4a8d-8f0d-2c3751c8e369","Type":"ContainerDied","Data":"c294bb47c11aef06babbc3dfa2cb769e394d2f7c5abed454dc5e435421a7dc3b"} Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.342141 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" event={"ID":"b200dd17-70ee-42af-a890-b7f748be7b01","Type":"ContainerStarted","Data":"6f2526efac9bf13cc837af9546762b5640ec06cc87543c28a10267cf34c8214c"} Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.342176 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" event={"ID":"b200dd17-70ee-42af-a890-b7f748be7b01","Type":"ContainerStarted","Data":"f65d9167ace8e65b2dbcd4a242e72fa22cbcdb44df756e82e12d1ecbd0d53a26"} Dec 03 
17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.343185 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.351003 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.358851 4841 scope.go:117] "RemoveContainer" containerID="98b093b496796467772e69c97d01c03e7307d386f8e7956241f9a1bc80ae68f6" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.361703 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nlrj"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.395109 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4nlrj"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.397904 4841 scope.go:117] "RemoveContainer" containerID="30958f4cd329db5ab4713fa275cbdaa5a247e2af6710d7b7cb275b2eaa8f3050" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.398278 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qhx4z" podStartSLOduration=1.398258689 podStartE2EDuration="1.398258689s" podCreationTimestamp="2025-12-03 17:06:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:06:09.371292493 +0000 UTC m=+363.758813220" watchObservedRunningTime="2025-12-03 17:06:09.398258689 +0000 UTC m=+363.785779416" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.416512 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-nr854"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.427684 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/community-operators-nr854"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.432758 4841 scope.go:117] "RemoveContainer" containerID="95720d968d815be533c1d2adb00983c1f208667747a13ace48dc9cec9fffe305" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.435661 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dshxz"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.440817 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dshxz"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.450142 4841 scope.go:117] "RemoveContainer" containerID="26a82e702d2a3c83881d3fff2c701595576f7485a8cfe56dd0ebf32c96699ccd" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.467515 4841 scope.go:117] "RemoveContainer" containerID="c94dcdd6c11ee6a14ed9505b6a249e47937a04dab5188ebdff788d2b54377bcc" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.479392 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-glndx"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.483928 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-glndx"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.484133 4841 scope.go:117] "RemoveContainer" containerID="bb428106c4a2a9973d08cf70ac3738908c6d8a54c375731eb45f0a2fd8adab84" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.487642 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p6dg7"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.491435 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-p6dg7"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.498124 4841 scope.go:117] "RemoveContainer" 
containerID="95bf333d16833b5f03b97023a7cea0639395386b974f669ae232f22bafff3153" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.508683 4841 scope.go:117] "RemoveContainer" containerID="ff00c6f334f026ce9d777ebc4c69728dbcd264c03f8734e4abd9a21ad0ed7d28" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.521744 4841 scope.go:117] "RemoveContainer" containerID="ab650f83cbf0e0eb180a8e8a63165c5ced7ed82102d06334445585569117035a" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.537161 4841 scope.go:117] "RemoveContainer" containerID="cb59915c8d4605ce62f632d28f71719c3731f6b76dd27d2578a4b8aa137fa82a" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.548582 4841 scope.go:117] "RemoveContainer" containerID="3f8e2c9167ae16077eab874a8a0c00dadcb95f3fc02d4427ebabdd11d585f04f" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704316 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-njclr"] Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704489 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0648214d-f0e4-4a8d-8f0d-2c3751c8e369" containerName="marketplace-operator" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704501 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0648214d-f0e4-4a8d-8f0d-2c3751c8e369" containerName="marketplace-operator" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704511 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704516 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704527 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerName="extract-content" Dec 03 17:06:09 
crc kubenswrapper[4841]: I1203 17:06:09.704534 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerName="extract-content" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704541 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68909a3d-4731-4851-a511-0b66e05d3741" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704547 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="68909a3d-4731-4851-a511-0b66e05d3741" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704557 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerName="extract-utilities" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704562 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerName="extract-utilities" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704571 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerName="extract-utilities" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704579 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerName="extract-utilities" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704587 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerName="extract-content" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704593 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerName="extract-content" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704600 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerName="extract-content" Dec 03 17:06:09 crc 
kubenswrapper[4841]: I1203 17:06:09.704606 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerName="extract-content" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704614 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68909a3d-4731-4851-a511-0b66e05d3741" containerName="extract-utilities" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704620 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="68909a3d-4731-4851-a511-0b66e05d3741" containerName="extract-utilities" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704627 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerName="extract-utilities" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704633 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerName="extract-utilities" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704640 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704646 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704654 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68909a3d-4731-4851-a511-0b66e05d3741" containerName="extract-content" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704661 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="68909a3d-4731-4851-a511-0b66e05d3741" containerName="extract-content" Dec 03 17:06:09 crc kubenswrapper[4841]: E1203 17:06:09.704670 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerName="registry-server" Dec 03 17:06:09 crc 
kubenswrapper[4841]: I1203 17:06:09.704676 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704749 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="68909a3d-4731-4851-a511-0b66e05d3741" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704759 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704770 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704777 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0648214d-f0e4-4a8d-8f0d-2c3751c8e369" containerName="marketplace-operator" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.704785 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" containerName="registry-server" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.705138 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.726119 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-njclr"] Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.807762 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.808119 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cc63784-673a-4a5b-b710-89eac43ed84b-bound-sa-token\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.808232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5crt\" (UniqueName: \"kubernetes.io/projected/1cc63784-673a-4a5b-b710-89eac43ed84b-kube-api-access-l5crt\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.808315 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cc63784-673a-4a5b-b710-89eac43ed84b-trusted-ca\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.808387 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1cc63784-673a-4a5b-b710-89eac43ed84b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.808485 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1cc63784-673a-4a5b-b710-89eac43ed84b-registry-tls\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.808572 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1cc63784-673a-4a5b-b710-89eac43ed84b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.808651 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1cc63784-673a-4a5b-b710-89eac43ed84b-registry-certificates\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.824826 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.909656 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1cc63784-673a-4a5b-b710-89eac43ed84b-registry-tls\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.909722 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1cc63784-673a-4a5b-b710-89eac43ed84b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.909743 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1cc63784-673a-4a5b-b710-89eac43ed84b-registry-certificates\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.909797 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cc63784-673a-4a5b-b710-89eac43ed84b-bound-sa-token\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.909815 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5crt\" (UniqueName: \"kubernetes.io/projected/1cc63784-673a-4a5b-b710-89eac43ed84b-kube-api-access-l5crt\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.909840 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1cc63784-673a-4a5b-b710-89eac43ed84b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.909854 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cc63784-673a-4a5b-b710-89eac43ed84b-trusted-ca\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.910691 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1cc63784-673a-4a5b-b710-89eac43ed84b-ca-trust-extracted\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.911280 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1cc63784-673a-4a5b-b710-89eac43ed84b-trusted-ca\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc 
kubenswrapper[4841]: I1203 17:06:09.911378 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1cc63784-673a-4a5b-b710-89eac43ed84b-registry-certificates\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.914839 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1cc63784-673a-4a5b-b710-89eac43ed84b-registry-tls\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.914951 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1cc63784-673a-4a5b-b710-89eac43ed84b-installation-pull-secrets\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.926020 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5crt\" (UniqueName: \"kubernetes.io/projected/1cc63784-673a-4a5b-b710-89eac43ed84b-kube-api-access-l5crt\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:09 crc kubenswrapper[4841]: I1203 17:06:09.926655 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1cc63784-673a-4a5b-b710-89eac43ed84b-bound-sa-token\") pod \"image-registry-66df7c8f76-njclr\" (UID: \"1cc63784-673a-4a5b-b710-89eac43ed84b\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.021184 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.074031 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4wkq7"] Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.078224 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.081272 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.084830 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wkq7"] Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.215055 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7f4062c-74e4-424e-9508-2b16e8788201-utilities\") pod \"redhat-marketplace-4wkq7\" (UID: \"c7f4062c-74e4-424e-9508-2b16e8788201\") " pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.215149 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2txz8\" (UniqueName: \"kubernetes.io/projected/c7f4062c-74e4-424e-9508-2b16e8788201-kube-api-access-2txz8\") pod \"redhat-marketplace-4wkq7\" (UID: \"c7f4062c-74e4-424e-9508-2b16e8788201\") " pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.215323 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/c7f4062c-74e4-424e-9508-2b16e8788201-catalog-content\") pod \"redhat-marketplace-4wkq7\" (UID: \"c7f4062c-74e4-424e-9508-2b16e8788201\") " pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.246010 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0648214d-f0e4-4a8d-8f0d-2c3751c8e369" path="/var/lib/kubelet/pods/0648214d-f0e4-4a8d-8f0d-2c3751c8e369/volumes" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.246680 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68909a3d-4731-4851-a511-0b66e05d3741" path="/var/lib/kubelet/pods/68909a3d-4731-4851-a511-0b66e05d3741/volumes" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.247241 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73b4070c-62dc-49b6-b2fe-8ae468318da3" path="/var/lib/kubelet/pods/73b4070c-62dc-49b6-b2fe-8ae468318da3/volumes" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.248195 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="958ead90-9bd3-4c1b-b9e5-21378ecff345" path="/var/lib/kubelet/pods/958ead90-9bd3-4c1b-b9e5-21378ecff345/volumes" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.248716 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9594e34-3c26-4c76-b090-c9d8218398a6" path="/var/lib/kubelet/pods/f9594e34-3c26-4c76-b090-c9d8218398a6/volumes" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.317261 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7f4062c-74e4-424e-9508-2b16e8788201-catalog-content\") pod \"redhat-marketplace-4wkq7\" (UID: \"c7f4062c-74e4-424e-9508-2b16e8788201\") " pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.317994 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7f4062c-74e4-424e-9508-2b16e8788201-catalog-content\") pod \"redhat-marketplace-4wkq7\" (UID: \"c7f4062c-74e4-424e-9508-2b16e8788201\") " pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.318053 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7f4062c-74e4-424e-9508-2b16e8788201-utilities\") pod \"redhat-marketplace-4wkq7\" (UID: \"c7f4062c-74e4-424e-9508-2b16e8788201\") " pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.318183 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2txz8\" (UniqueName: \"kubernetes.io/projected/c7f4062c-74e4-424e-9508-2b16e8788201-kube-api-access-2txz8\") pod \"redhat-marketplace-4wkq7\" (UID: \"c7f4062c-74e4-424e-9508-2b16e8788201\") " pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.318226 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7f4062c-74e4-424e-9508-2b16e8788201-utilities\") pod \"redhat-marketplace-4wkq7\" (UID: \"c7f4062c-74e4-424e-9508-2b16e8788201\") " pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.343661 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2txz8\" (UniqueName: \"kubernetes.io/projected/c7f4062c-74e4-424e-9508-2b16e8788201-kube-api-access-2txz8\") pod \"redhat-marketplace-4wkq7\" (UID: \"c7f4062c-74e4-424e-9508-2b16e8788201\") " pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.402251 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.441792 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-njclr"] Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.659232 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2kk66"] Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.660489 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.662675 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.667011 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2kk66"] Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.723244 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e06ffa98-bb06-47e5-ad3a-54d48e4886c8-utilities\") pod \"certified-operators-2kk66\" (UID: \"e06ffa98-bb06-47e5-ad3a-54d48e4886c8\") " pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.723371 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e06ffa98-bb06-47e5-ad3a-54d48e4886c8-catalog-content\") pod \"certified-operators-2kk66\" (UID: \"e06ffa98-bb06-47e5-ad3a-54d48e4886c8\") " pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.723422 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvvf6\" 
(UniqueName: \"kubernetes.io/projected/e06ffa98-bb06-47e5-ad3a-54d48e4886c8-kube-api-access-kvvf6\") pod \"certified-operators-2kk66\" (UID: \"e06ffa98-bb06-47e5-ad3a-54d48e4886c8\") " pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.811342 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4wkq7"] Dec 03 17:06:10 crc kubenswrapper[4841]: W1203 17:06:10.814989 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7f4062c_74e4_424e_9508_2b16e8788201.slice/crio-615626072c163272e93d9dbba753c60cd62ea57e8d35160461d71938cdeec583 WatchSource:0}: Error finding container 615626072c163272e93d9dbba753c60cd62ea57e8d35160461d71938cdeec583: Status 404 returned error can't find the container with id 615626072c163272e93d9dbba753c60cd62ea57e8d35160461d71938cdeec583 Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.824154 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvvf6\" (UniqueName: \"kubernetes.io/projected/e06ffa98-bb06-47e5-ad3a-54d48e4886c8-kube-api-access-kvvf6\") pod \"certified-operators-2kk66\" (UID: \"e06ffa98-bb06-47e5-ad3a-54d48e4886c8\") " pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.824221 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e06ffa98-bb06-47e5-ad3a-54d48e4886c8-utilities\") pod \"certified-operators-2kk66\" (UID: \"e06ffa98-bb06-47e5-ad3a-54d48e4886c8\") " pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.824265 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/e06ffa98-bb06-47e5-ad3a-54d48e4886c8-catalog-content\") pod \"certified-operators-2kk66\" (UID: \"e06ffa98-bb06-47e5-ad3a-54d48e4886c8\") " pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.824674 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e06ffa98-bb06-47e5-ad3a-54d48e4886c8-catalog-content\") pod \"certified-operators-2kk66\" (UID: \"e06ffa98-bb06-47e5-ad3a-54d48e4886c8\") " pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.824745 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e06ffa98-bb06-47e5-ad3a-54d48e4886c8-utilities\") pod \"certified-operators-2kk66\" (UID: \"e06ffa98-bb06-47e5-ad3a-54d48e4886c8\") " pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.845750 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvvf6\" (UniqueName: \"kubernetes.io/projected/e06ffa98-bb06-47e5-ad3a-54d48e4886c8-kube-api-access-kvvf6\") pod \"certified-operators-2kk66\" (UID: \"e06ffa98-bb06-47e5-ad3a-54d48e4886c8\") " pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:10 crc kubenswrapper[4841]: I1203 17:06:10.973037 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:11 crc kubenswrapper[4841]: E1203 17:06:11.008392 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7f4062c_74e4_424e_9508_2b16e8788201.slice/crio-cedb4ba64bbf6857db20ae54193af4aa06152e937ff1b833c112f55b31e3e89b.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7f4062c_74e4_424e_9508_2b16e8788201.slice/crio-conmon-cedb4ba64bbf6857db20ae54193af4aa06152e937ff1b833c112f55b31e3e89b.scope\": RecentStats: unable to find data in memory cache]" Dec 03 17:06:11 crc kubenswrapper[4841]: I1203 17:06:11.356142 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2kk66"] Dec 03 17:06:11 crc kubenswrapper[4841]: W1203 17:06:11.362188 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode06ffa98_bb06_47e5_ad3a_54d48e4886c8.slice/crio-109ff0452603779e2192e69b73f6aabc5453654140f97575675f744abd1058d8 WatchSource:0}: Error finding container 109ff0452603779e2192e69b73f6aabc5453654140f97575675f744abd1058d8: Status 404 returned error can't find the container with id 109ff0452603779e2192e69b73f6aabc5453654140f97575675f744abd1058d8 Dec 03 17:06:11 crc kubenswrapper[4841]: I1203 17:06:11.363738 4841 generic.go:334] "Generic (PLEG): container finished" podID="c7f4062c-74e4-424e-9508-2b16e8788201" containerID="cedb4ba64bbf6857db20ae54193af4aa06152e937ff1b833c112f55b31e3e89b" exitCode=0 Dec 03 17:06:11 crc kubenswrapper[4841]: I1203 17:06:11.363805 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wkq7" 
event={"ID":"c7f4062c-74e4-424e-9508-2b16e8788201","Type":"ContainerDied","Data":"cedb4ba64bbf6857db20ae54193af4aa06152e937ff1b833c112f55b31e3e89b"} Dec 03 17:06:11 crc kubenswrapper[4841]: I1203 17:06:11.363830 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wkq7" event={"ID":"c7f4062c-74e4-424e-9508-2b16e8788201","Type":"ContainerStarted","Data":"615626072c163272e93d9dbba753c60cd62ea57e8d35160461d71938cdeec583"} Dec 03 17:06:11 crc kubenswrapper[4841]: I1203 17:06:11.366064 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-njclr" event={"ID":"1cc63784-673a-4a5b-b710-89eac43ed84b","Type":"ContainerStarted","Data":"148cb7b2b1f0c08bd292480ab8a88830b6115be1b2bc4dd194903831c8ae1720"} Dec 03 17:06:11 crc kubenswrapper[4841]: I1203 17:06:11.366087 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-njclr" event={"ID":"1cc63784-673a-4a5b-b710-89eac43ed84b","Type":"ContainerStarted","Data":"846b381e8dd210da07b65b186a41644f550d58a55597e185bd95ca5b47c4d9b0"} Dec 03 17:06:11 crc kubenswrapper[4841]: I1203 17:06:11.366127 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:11 crc kubenswrapper[4841]: I1203 17:06:11.405496 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-njclr" podStartSLOduration=2.40547488 podStartE2EDuration="2.40547488s" podCreationTimestamp="2025-12-03 17:06:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:06:11.402488911 +0000 UTC m=+365.790009638" watchObservedRunningTime="2025-12-03 17:06:11.40547488 +0000 UTC m=+365.792995607" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.372937 4841 generic.go:334] 
"Generic (PLEG): container finished" podID="c7f4062c-74e4-424e-9508-2b16e8788201" containerID="496518fc67aee1356366fdff45124237f6c8b53d9d0928969747740ba3ee4cd4" exitCode=0 Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.373029 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wkq7" event={"ID":"c7f4062c-74e4-424e-9508-2b16e8788201","Type":"ContainerDied","Data":"496518fc67aee1356366fdff45124237f6c8b53d9d0928969747740ba3ee4cd4"} Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.377741 4841 generic.go:334] "Generic (PLEG): container finished" podID="e06ffa98-bb06-47e5-ad3a-54d48e4886c8" containerID="4a8f74e44ab6f30279ec95422be6020acb3effff0c94806e01a6bccbac0018f0" exitCode=0 Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.377807 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kk66" event={"ID":"e06ffa98-bb06-47e5-ad3a-54d48e4886c8","Type":"ContainerDied","Data":"4a8f74e44ab6f30279ec95422be6020acb3effff0c94806e01a6bccbac0018f0"} Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.377843 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kk66" event={"ID":"e06ffa98-bb06-47e5-ad3a-54d48e4886c8","Type":"ContainerStarted","Data":"109ff0452603779e2192e69b73f6aabc5453654140f97575675f744abd1058d8"} Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.463986 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f8fk5"] Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.466969 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.468825 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.473949 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f8fk5"] Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.547894 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11cfe39b-967a-4099-bd85-414e09e2fc18-catalog-content\") pod \"redhat-operators-f8fk5\" (UID: \"11cfe39b-967a-4099-bd85-414e09e2fc18\") " pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.547962 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm27r\" (UniqueName: \"kubernetes.io/projected/11cfe39b-967a-4099-bd85-414e09e2fc18-kube-api-access-qm27r\") pod \"redhat-operators-f8fk5\" (UID: \"11cfe39b-967a-4099-bd85-414e09e2fc18\") " pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.547991 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11cfe39b-967a-4099-bd85-414e09e2fc18-utilities\") pod \"redhat-operators-f8fk5\" (UID: \"11cfe39b-967a-4099-bd85-414e09e2fc18\") " pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.649420 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11cfe39b-967a-4099-bd85-414e09e2fc18-catalog-content\") pod \"redhat-operators-f8fk5\" (UID: 
\"11cfe39b-967a-4099-bd85-414e09e2fc18\") " pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.649496 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm27r\" (UniqueName: \"kubernetes.io/projected/11cfe39b-967a-4099-bd85-414e09e2fc18-kube-api-access-qm27r\") pod \"redhat-operators-f8fk5\" (UID: \"11cfe39b-967a-4099-bd85-414e09e2fc18\") " pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.649538 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11cfe39b-967a-4099-bd85-414e09e2fc18-utilities\") pod \"redhat-operators-f8fk5\" (UID: \"11cfe39b-967a-4099-bd85-414e09e2fc18\") " pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.650240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11cfe39b-967a-4099-bd85-414e09e2fc18-utilities\") pod \"redhat-operators-f8fk5\" (UID: \"11cfe39b-967a-4099-bd85-414e09e2fc18\") " pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.650259 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11cfe39b-967a-4099-bd85-414e09e2fc18-catalog-content\") pod \"redhat-operators-f8fk5\" (UID: \"11cfe39b-967a-4099-bd85-414e09e2fc18\") " pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.671601 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm27r\" (UniqueName: \"kubernetes.io/projected/11cfe39b-967a-4099-bd85-414e09e2fc18-kube-api-access-qm27r\") pod \"redhat-operators-f8fk5\" (UID: \"11cfe39b-967a-4099-bd85-414e09e2fc18\") " 
pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:12 crc kubenswrapper[4841]: I1203 17:06:12.793565 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.065777 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rwffh"] Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.067523 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.069273 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.077040 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwffh"] Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.127097 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f8fk5"] Dec 03 17:06:13 crc kubenswrapper[4841]: W1203 17:06:13.133796 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11cfe39b_967a_4099_bd85_414e09e2fc18.slice/crio-5f19b17d7d5559aa951258c4aa9212a0ae5f6b9e021668008edd2ee044ae873b WatchSource:0}: Error finding container 5f19b17d7d5559aa951258c4aa9212a0ae5f6b9e021668008edd2ee044ae873b: Status 404 returned error can't find the container with id 5f19b17d7d5559aa951258c4aa9212a0ae5f6b9e021668008edd2ee044ae873b Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.156562 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190f7b14-18cf-4fb0-bfdf-fa21a8ded991-utilities\") pod \"community-operators-rwffh\" (UID: 
\"190f7b14-18cf-4fb0-bfdf-fa21a8ded991\") " pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.156647 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnthn\" (UniqueName: \"kubernetes.io/projected/190f7b14-18cf-4fb0-bfdf-fa21a8ded991-kube-api-access-lnthn\") pod \"community-operators-rwffh\" (UID: \"190f7b14-18cf-4fb0-bfdf-fa21a8ded991\") " pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.156671 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190f7b14-18cf-4fb0-bfdf-fa21a8ded991-catalog-content\") pod \"community-operators-rwffh\" (UID: \"190f7b14-18cf-4fb0-bfdf-fa21a8ded991\") " pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.258758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190f7b14-18cf-4fb0-bfdf-fa21a8ded991-utilities\") pod \"community-operators-rwffh\" (UID: \"190f7b14-18cf-4fb0-bfdf-fa21a8ded991\") " pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.259301 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/190f7b14-18cf-4fb0-bfdf-fa21a8ded991-utilities\") pod \"community-operators-rwffh\" (UID: \"190f7b14-18cf-4fb0-bfdf-fa21a8ded991\") " pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.266437 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnthn\" (UniqueName: \"kubernetes.io/projected/190f7b14-18cf-4fb0-bfdf-fa21a8ded991-kube-api-access-lnthn\") pod 
\"community-operators-rwffh\" (UID: \"190f7b14-18cf-4fb0-bfdf-fa21a8ded991\") " pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.266524 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190f7b14-18cf-4fb0-bfdf-fa21a8ded991-catalog-content\") pod \"community-operators-rwffh\" (UID: \"190f7b14-18cf-4fb0-bfdf-fa21a8ded991\") " pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.268173 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/190f7b14-18cf-4fb0-bfdf-fa21a8ded991-catalog-content\") pod \"community-operators-rwffh\" (UID: \"190f7b14-18cf-4fb0-bfdf-fa21a8ded991\") " pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.295218 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnthn\" (UniqueName: \"kubernetes.io/projected/190f7b14-18cf-4fb0-bfdf-fa21a8ded991-kube-api-access-lnthn\") pod \"community-operators-rwffh\" (UID: \"190f7b14-18cf-4fb0-bfdf-fa21a8ded991\") " pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.386172 4841 generic.go:334] "Generic (PLEG): container finished" podID="11cfe39b-967a-4099-bd85-414e09e2fc18" containerID="ed27622821c1a824176ae3d1b0e9268cbb60bfd79d79521fe5a590639c91e1cf" exitCode=0 Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.386225 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8fk5" event={"ID":"11cfe39b-967a-4099-bd85-414e09e2fc18","Type":"ContainerDied","Data":"ed27622821c1a824176ae3d1b0e9268cbb60bfd79d79521fe5a590639c91e1cf"} Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.386252 4841 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-operators-f8fk5" event={"ID":"11cfe39b-967a-4099-bd85-414e09e2fc18","Type":"ContainerStarted","Data":"5f19b17d7d5559aa951258c4aa9212a0ae5f6b9e021668008edd2ee044ae873b"} Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.392685 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4wkq7" event={"ID":"c7f4062c-74e4-424e-9508-2b16e8788201","Type":"ContainerStarted","Data":"8beac3b87a1851b9d5ef866669c17e6c13b4273c061c33c7386f50dea64f3249"} Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.404015 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.428684 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4wkq7" podStartSLOduration=1.928021514 podStartE2EDuration="3.428666634s" podCreationTimestamp="2025-12-03 17:06:10 +0000 UTC" firstStartedPulling="2025-12-03 17:06:11.37006606 +0000 UTC m=+365.757586787" lastFinishedPulling="2025-12-03 17:06:12.87071118 +0000 UTC m=+367.258231907" observedRunningTime="2025-12-03 17:06:13.425814158 +0000 UTC m=+367.813334885" watchObservedRunningTime="2025-12-03 17:06:13.428666634 +0000 UTC m=+367.816187361" Dec 03 17:06:13 crc kubenswrapper[4841]: I1203 17:06:13.814277 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rwffh"] Dec 03 17:06:14 crc kubenswrapper[4841]: I1203 17:06:14.399308 4841 generic.go:334] "Generic (PLEG): container finished" podID="190f7b14-18cf-4fb0-bfdf-fa21a8ded991" containerID="b043cc16c2ed497e9246c322af28a5d67c35e2cd2fc1e18f3f263528d78eab1e" exitCode=0 Dec 03 17:06:14 crc kubenswrapper[4841]: I1203 17:06:14.399709 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwffh" 
event={"ID":"190f7b14-18cf-4fb0-bfdf-fa21a8ded991","Type":"ContainerDied","Data":"b043cc16c2ed497e9246c322af28a5d67c35e2cd2fc1e18f3f263528d78eab1e"} Dec 03 17:06:14 crc kubenswrapper[4841]: I1203 17:06:14.399744 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwffh" event={"ID":"190f7b14-18cf-4fb0-bfdf-fa21a8ded991","Type":"ContainerStarted","Data":"caa346d67cf4958c6eb738409336a43068bb8b36436abdffc08538c58d207819"} Dec 03 17:06:14 crc kubenswrapper[4841]: I1203 17:06:14.402152 4841 generic.go:334] "Generic (PLEG): container finished" podID="e06ffa98-bb06-47e5-ad3a-54d48e4886c8" containerID="5bf8bde8a586c04df3a14df2194fa49cd29bbbdc9babb768fdcfe7f5a909d6a6" exitCode=0 Dec 03 17:06:14 crc kubenswrapper[4841]: I1203 17:06:14.402284 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kk66" event={"ID":"e06ffa98-bb06-47e5-ad3a-54d48e4886c8","Type":"ContainerDied","Data":"5bf8bde8a586c04df3a14df2194fa49cd29bbbdc9babb768fdcfe7f5a909d6a6"} Dec 03 17:06:14 crc kubenswrapper[4841]: I1203 17:06:14.406152 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8fk5" event={"ID":"11cfe39b-967a-4099-bd85-414e09e2fc18","Type":"ContainerStarted","Data":"65fc60ab930d2b5279310d538bce35245f0f2c5820ddfe048e952771a5db77bf"} Dec 03 17:06:15 crc kubenswrapper[4841]: I1203 17:06:15.412776 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2kk66" event={"ID":"e06ffa98-bb06-47e5-ad3a-54d48e4886c8","Type":"ContainerStarted","Data":"3a1def23e31b93ab680441e2dede1dac23cdef9bf3ab5525beb626ba984d0d36"} Dec 03 17:06:15 crc kubenswrapper[4841]: I1203 17:06:15.414418 4841 generic.go:334] "Generic (PLEG): container finished" podID="11cfe39b-967a-4099-bd85-414e09e2fc18" containerID="65fc60ab930d2b5279310d538bce35245f0f2c5820ddfe048e952771a5db77bf" exitCode=0 Dec 03 17:06:15 crc kubenswrapper[4841]: I1203 
17:06:15.414443 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8fk5" event={"ID":"11cfe39b-967a-4099-bd85-414e09e2fc18","Type":"ContainerDied","Data":"65fc60ab930d2b5279310d538bce35245f0f2c5820ddfe048e952771a5db77bf"} Dec 03 17:06:15 crc kubenswrapper[4841]: I1203 17:06:15.416260 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwffh" event={"ID":"190f7b14-18cf-4fb0-bfdf-fa21a8ded991","Type":"ContainerStarted","Data":"4574f8301b559b04fa2f14789129072bf321990f476c7210a76c2a4029e7b4a7"} Dec 03 17:06:15 crc kubenswrapper[4841]: I1203 17:06:15.437209 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2kk66" podStartSLOduration=3.021322438 podStartE2EDuration="5.437192968s" podCreationTimestamp="2025-12-03 17:06:10 +0000 UTC" firstStartedPulling="2025-12-03 17:06:12.379636623 +0000 UTC m=+366.767157340" lastFinishedPulling="2025-12-03 17:06:14.795507123 +0000 UTC m=+369.183027870" observedRunningTime="2025-12-03 17:06:15.434392384 +0000 UTC m=+369.821913111" watchObservedRunningTime="2025-12-03 17:06:15.437192968 +0000 UTC m=+369.824713695" Dec 03 17:06:16 crc kubenswrapper[4841]: I1203 17:06:16.422996 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f8fk5" event={"ID":"11cfe39b-967a-4099-bd85-414e09e2fc18","Type":"ContainerStarted","Data":"5351437e9cc1bc05d614dc5c85b09dcc56488bda1df755f02b96d816664524f9"} Dec 03 17:06:16 crc kubenswrapper[4841]: I1203 17:06:16.424547 4841 generic.go:334] "Generic (PLEG): container finished" podID="190f7b14-18cf-4fb0-bfdf-fa21a8ded991" containerID="4574f8301b559b04fa2f14789129072bf321990f476c7210a76c2a4029e7b4a7" exitCode=0 Dec 03 17:06:16 crc kubenswrapper[4841]: I1203 17:06:16.425099 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwffh" 
event={"ID":"190f7b14-18cf-4fb0-bfdf-fa21a8ded991","Type":"ContainerDied","Data":"4574f8301b559b04fa2f14789129072bf321990f476c7210a76c2a4029e7b4a7"} Dec 03 17:06:16 crc kubenswrapper[4841]: I1203 17:06:16.441186 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f8fk5" podStartSLOduration=2.017114046 podStartE2EDuration="4.441162363s" podCreationTimestamp="2025-12-03 17:06:12 +0000 UTC" firstStartedPulling="2025-12-03 17:06:13.387507111 +0000 UTC m=+367.775027838" lastFinishedPulling="2025-12-03 17:06:15.811555428 +0000 UTC m=+370.199076155" observedRunningTime="2025-12-03 17:06:16.439080498 +0000 UTC m=+370.826601265" watchObservedRunningTime="2025-12-03 17:06:16.441162363 +0000 UTC m=+370.828683120" Dec 03 17:06:18 crc kubenswrapper[4841]: I1203 17:06:18.439638 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rwffh" event={"ID":"190f7b14-18cf-4fb0-bfdf-fa21a8ded991","Type":"ContainerStarted","Data":"3d696350cb6f3c7ed1e038383c8b73bd013ea548dd13a8ab9453446df0b99d88"} Dec 03 17:06:18 crc kubenswrapper[4841]: I1203 17:06:18.465937 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rwffh" podStartSLOduration=3.031071556 podStartE2EDuration="5.465885319s" podCreationTimestamp="2025-12-03 17:06:13 +0000 UTC" firstStartedPulling="2025-12-03 17:06:14.401992055 +0000 UTC m=+368.789512782" lastFinishedPulling="2025-12-03 17:06:16.836805808 +0000 UTC m=+371.224326545" observedRunningTime="2025-12-03 17:06:18.465638962 +0000 UTC m=+372.853159689" watchObservedRunningTime="2025-12-03 17:06:18.465885319 +0000 UTC m=+372.853406086" Dec 03 17:06:20 crc kubenswrapper[4841]: I1203 17:06:20.403183 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:20 crc kubenswrapper[4841]: I1203 17:06:20.403345 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:20 crc kubenswrapper[4841]: I1203 17:06:20.448787 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:20 crc kubenswrapper[4841]: I1203 17:06:20.494214 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4wkq7" Dec 03 17:06:20 crc kubenswrapper[4841]: I1203 17:06:20.974300 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:20 crc kubenswrapper[4841]: I1203 17:06:20.974355 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:21 crc kubenswrapper[4841]: I1203 17:06:21.012827 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:21 crc kubenswrapper[4841]: I1203 17:06:21.495539 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2kk66" Dec 03 17:06:22 crc kubenswrapper[4841]: I1203 17:06:22.795024 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:22 crc kubenswrapper[4841]: I1203 17:06:22.795347 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:22 crc kubenswrapper[4841]: I1203 17:06:22.831448 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:23 crc kubenswrapper[4841]: I1203 17:06:23.403152 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:23 crc kubenswrapper[4841]: I1203 17:06:23.403217 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:23 crc kubenswrapper[4841]: I1203 17:06:23.438209 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:23 crc kubenswrapper[4841]: I1203 17:06:23.507228 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f8fk5" Dec 03 17:06:23 crc kubenswrapper[4841]: I1203 17:06:23.507660 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rwffh" Dec 03 17:06:30 crc kubenswrapper[4841]: I1203 17:06:30.029057 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-njclr" Dec 03 17:06:30 crc kubenswrapper[4841]: I1203 17:06:30.087742 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mtwkx"] Dec 03 17:06:30 crc kubenswrapper[4841]: I1203 17:06:30.835549 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx"] Dec 03 17:06:30 crc kubenswrapper[4841]: I1203 17:06:30.836114 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" podUID="064467ef-c3e0-4219-936c-9f0d581b065d" containerName="route-controller-manager" containerID="cri-o://c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd" gracePeriod=30 Dec 03 17:06:32 crc kubenswrapper[4841]: I1203 17:06:32.488759 4841 patch_prober.go:28] interesting pod/route-controller-manager-7bf9495b6c-z99rx container/route-controller-manager 
namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Dec 03 17:06:32 crc kubenswrapper[4841]: I1203 17:06:32.489226 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" podUID="064467ef-c3e0-4219-936c-9f0d581b065d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.105814 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.134806 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm"] Dec 03 17:06:33 crc kubenswrapper[4841]: E1203 17:06:33.135059 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="064467ef-c3e0-4219-936c-9f0d581b065d" containerName="route-controller-manager" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.135075 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="064467ef-c3e0-4219-936c-9f0d581b065d" containerName="route-controller-manager" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.135218 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="064467ef-c3e0-4219-936c-9f0d581b065d" containerName="route-controller-manager" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.135568 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.159800 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm"] Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.241450 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-client-ca\") pod \"064467ef-c3e0-4219-936c-9f0d581b065d\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.241588 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/064467ef-c3e0-4219-936c-9f0d581b065d-serving-cert\") pod \"064467ef-c3e0-4219-936c-9f0d581b065d\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.241622 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhc4l\" (UniqueName: \"kubernetes.io/projected/064467ef-c3e0-4219-936c-9f0d581b065d-kube-api-access-mhc4l\") pod \"064467ef-c3e0-4219-936c-9f0d581b065d\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.241655 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-config\") pod \"064467ef-c3e0-4219-936c-9f0d581b065d\" (UID: \"064467ef-c3e0-4219-936c-9f0d581b065d\") " Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.241804 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-serving-cert\") pod 
\"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.241865 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j8nj\" (UniqueName: \"kubernetes.io/projected/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-kube-api-access-5j8nj\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.241960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-config\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.242097 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-client-ca\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.242724 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-client-ca" (OuterVolumeSpecName: "client-ca") pod "064467ef-c3e0-4219-936c-9f0d581b065d" (UID: "064467ef-c3e0-4219-936c-9f0d581b065d"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.242862 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-config" (OuterVolumeSpecName: "config") pod "064467ef-c3e0-4219-936c-9f0d581b065d" (UID: "064467ef-c3e0-4219-936c-9f0d581b065d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.250220 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/064467ef-c3e0-4219-936c-9f0d581b065d-kube-api-access-mhc4l" (OuterVolumeSpecName: "kube-api-access-mhc4l") pod "064467ef-c3e0-4219-936c-9f0d581b065d" (UID: "064467ef-c3e0-4219-936c-9f0d581b065d"). InnerVolumeSpecName "kube-api-access-mhc4l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.250361 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/064467ef-c3e0-4219-936c-9f0d581b065d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "064467ef-c3e0-4219-936c-9f0d581b065d" (UID: "064467ef-c3e0-4219-936c-9f0d581b065d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.343641 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j8nj\" (UniqueName: \"kubernetes.io/projected/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-kube-api-access-5j8nj\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.346996 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-config\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.347064 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-config\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.347198 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-client-ca\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.347316 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-serving-cert\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.347462 4841 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/064467ef-c3e0-4219-936c-9f0d581b065d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.347499 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhc4l\" (UniqueName: \"kubernetes.io/projected/064467ef-c3e0-4219-936c-9f0d581b065d-kube-api-access-mhc4l\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.347527 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.347550 4841 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/064467ef-c3e0-4219-936c-9f0d581b065d-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.349503 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-client-ca\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.360027 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-serving-cert\") pod 
\"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.367804 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j8nj\" (UniqueName: \"kubernetes.io/projected/ee2b72a6-5bb4-44e1-8b9d-85c0976938e6-kube-api-access-5j8nj\") pod \"route-controller-manager-869fdbb5ff-bq2vm\" (UID: \"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6\") " pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.454891 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.542830 4841 generic.go:334] "Generic (PLEG): container finished" podID="064467ef-c3e0-4219-936c-9f0d581b065d" containerID="c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd" exitCode=0 Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.542952 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.542986 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" event={"ID":"064467ef-c3e0-4219-936c-9f0d581b065d","Type":"ContainerDied","Data":"c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd"} Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.544233 4841 scope.go:117] "RemoveContainer" containerID="c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.544421 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx" event={"ID":"064467ef-c3e0-4219-936c-9f0d581b065d","Type":"ContainerDied","Data":"d18c288ac6e3663a38b575bd8bcf0da0d066a9696e41bfd12ee05b56d0a47484"} Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.570394 4841 scope.go:117] "RemoveContainer" containerID="c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd" Dec 03 17:06:33 crc kubenswrapper[4841]: E1203 17:06:33.570843 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd\": container with ID starting with c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd not found: ID does not exist" containerID="c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.570887 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd"} err="failed to get container status \"c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd\": rpc error: code = NotFound desc 
= could not find container \"c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd\": container with ID starting with c59532452cbe0fecb61224b24292c52d48e78245cf6418be19a1d766310eedcd not found: ID does not exist" Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.596064 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx"] Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.600640 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7bf9495b6c-z99rx"] Dec 03 17:06:33 crc kubenswrapper[4841]: I1203 17:06:33.705351 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm"] Dec 03 17:06:33 crc kubenswrapper[4841]: W1203 17:06:33.710709 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2b72a6_5bb4_44e1_8b9d_85c0976938e6.slice/crio-da843e87209a549847700b3c1d5f36a6d21235ab6e2b1785d1f7195312439cca WatchSource:0}: Error finding container da843e87209a549847700b3c1d5f36a6d21235ab6e2b1785d1f7195312439cca: Status 404 returned error can't find the container with id da843e87209a549847700b3c1d5f36a6d21235ab6e2b1785d1f7195312439cca Dec 03 17:06:34 crc kubenswrapper[4841]: I1203 17:06:34.244830 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="064467ef-c3e0-4219-936c-9f0d581b065d" path="/var/lib/kubelet/pods/064467ef-c3e0-4219-936c-9f0d581b065d/volumes" Dec 03 17:06:34 crc kubenswrapper[4841]: I1203 17:06:34.550299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" event={"ID":"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6","Type":"ContainerStarted","Data":"212d75bc7d1e6275cfd87e4a964bc5351cee440a82c16b6c2ef00f374ed283b5"} Dec 03 17:06:34 crc 
kubenswrapper[4841]: I1203 17:06:34.550342 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" event={"ID":"ee2b72a6-5bb4-44e1-8b9d-85c0976938e6","Type":"ContainerStarted","Data":"da843e87209a549847700b3c1d5f36a6d21235ab6e2b1785d1f7195312439cca"} Dec 03 17:06:34 crc kubenswrapper[4841]: I1203 17:06:34.550557 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:34 crc kubenswrapper[4841]: I1203 17:06:34.557040 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" Dec 03 17:06:34 crc kubenswrapper[4841]: I1203 17:06:34.567826 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-869fdbb5ff-bq2vm" podStartSLOduration=4.5678085920000004 podStartE2EDuration="4.567808592s" podCreationTimestamp="2025-12-03 17:06:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:06:34.56548509 +0000 UTC m=+388.953005817" watchObservedRunningTime="2025-12-03 17:06:34.567808592 +0000 UTC m=+388.955329319" Dec 03 17:06:39 crc kubenswrapper[4841]: I1203 17:06:39.316710 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:06:39 crc kubenswrapper[4841]: I1203 17:06:39.317169 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.144840 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" podUID="3c61a910-e9a4-4f77-a5d4-56e760ed1394" containerName="registry" containerID="cri-o://6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10" gracePeriod=30 Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.598789 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.701620 4841 generic.go:334] "Generic (PLEG): container finished" podID="3c61a910-e9a4-4f77-a5d4-56e760ed1394" containerID="6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10" exitCode=0 Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.701680 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" event={"ID":"3c61a910-e9a4-4f77-a5d4-56e760ed1394","Type":"ContainerDied","Data":"6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10"} Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.701711 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.701739 4841 scope.go:117] "RemoveContainer" containerID="6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.701723 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-mtwkx" event={"ID":"3c61a910-e9a4-4f77-a5d4-56e760ed1394","Type":"ContainerDied","Data":"c5c2ffe93b02ec22e76079ca0477828d217d2b82e6b2c8539aa561ac532308a4"} Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.723976 4841 scope.go:117] "RemoveContainer" containerID="6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10" Dec 03 17:06:55 crc kubenswrapper[4841]: E1203 17:06:55.724506 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10\": container with ID starting with 6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10 not found: ID does not exist" containerID="6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.724588 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10"} err="failed to get container status \"6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10\": rpc error: code = NotFound desc = could not find container \"6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10\": container with ID starting with 6d35df69b63edecfae4aa4fe0f7b2b11f729211208a4ba6c75777ebb878ead10 not found: ID does not exist" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.730574 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6bjf8\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-kube-api-access-6bjf8\") pod \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.730650 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-trusted-ca\") pod \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.730982 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.731268 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-certificates\") pod \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.731402 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-tls\") pod \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.731590 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c61a910-e9a4-4f77-a5d4-56e760ed1394-ca-trust-extracted\") pod \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\" (UID: 
\"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.731810 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-bound-sa-token\") pod \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.731971 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c61a910-e9a4-4f77-a5d4-56e760ed1394-installation-pull-secrets\") pod \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\" (UID: \"3c61a910-e9a4-4f77-a5d4-56e760ed1394\") " Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.732360 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "3c61a910-e9a4-4f77-a5d4-56e760ed1394" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.732561 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "3c61a910-e9a4-4f77-a5d4-56e760ed1394" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.739079 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "3c61a910-e9a4-4f77-a5d4-56e760ed1394" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.739721 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "3c61a910-e9a4-4f77-a5d4-56e760ed1394" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.739736 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c61a910-e9a4-4f77-a5d4-56e760ed1394-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "3c61a910-e9a4-4f77-a5d4-56e760ed1394" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.740294 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-kube-api-access-6bjf8" (OuterVolumeSpecName: "kube-api-access-6bjf8") pod "3c61a910-e9a4-4f77-a5d4-56e760ed1394" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394"). InnerVolumeSpecName "kube-api-access-6bjf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.744134 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "3c61a910-e9a4-4f77-a5d4-56e760ed1394" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.754443 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c61a910-e9a4-4f77-a5d4-56e760ed1394-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "3c61a910-e9a4-4f77-a5d4-56e760ed1394" (UID: "3c61a910-e9a4-4f77-a5d4-56e760ed1394"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.833693 4841 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/3c61a910-e9a4-4f77-a5d4-56e760ed1394-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.833733 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bjf8\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-kube-api-access-6bjf8\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.833750 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.833765 4841 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.833779 4841 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.833792 4841 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/3c61a910-e9a4-4f77-a5d4-56e760ed1394-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:55 crc kubenswrapper[4841]: I1203 17:06:55.833804 4841 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3c61a910-e9a4-4f77-a5d4-56e760ed1394-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 17:06:56 crc kubenswrapper[4841]: I1203 17:06:56.040252 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mtwkx"] Dec 03 17:06:56 crc kubenswrapper[4841]: I1203 17:06:56.042423 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-mtwkx"] Dec 03 17:06:56 crc kubenswrapper[4841]: I1203 17:06:56.246257 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c61a910-e9a4-4f77-a5d4-56e760ed1394" path="/var/lib/kubelet/pods/3c61a910-e9a4-4f77-a5d4-56e760ed1394/volumes" Dec 03 17:07:09 crc kubenswrapper[4841]: I1203 17:07:09.316968 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:07:09 crc kubenswrapper[4841]: I1203 17:07:09.317631 4841 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:07:09 crc kubenswrapper[4841]: I1203 17:07:09.317696 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:07:09 crc kubenswrapper[4841]: I1203 17:07:09.318605 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"daa18669936eceb48d01d308236c99f78a8e0296b021db0d3c986e026fa5670c"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:07:09 crc kubenswrapper[4841]: I1203 17:07:09.318711 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://daa18669936eceb48d01d308236c99f78a8e0296b021db0d3c986e026fa5670c" gracePeriod=600 Dec 03 17:07:10 crc kubenswrapper[4841]: I1203 17:07:10.806290 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="daa18669936eceb48d01d308236c99f78a8e0296b021db0d3c986e026fa5670c" exitCode=0 Dec 03 17:07:10 crc kubenswrapper[4841]: I1203 17:07:10.806375 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"daa18669936eceb48d01d308236c99f78a8e0296b021db0d3c986e026fa5670c"} Dec 03 17:07:10 crc kubenswrapper[4841]: I1203 17:07:10.806931 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"eb72285561a179b3790b876fa15c5bce1255f9d0d5aee06faebb894344a15403"} Dec 03 17:07:10 crc kubenswrapper[4841]: I1203 17:07:10.806961 4841 scope.go:117] "RemoveContainer" containerID="ce6869bf4ae433ac2a3fa26a8162c28292f2835e189291cd904cd8025d257014" Dec 03 17:09:39 crc kubenswrapper[4841]: I1203 17:09:39.316603 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:09:39 crc kubenswrapper[4841]: I1203 17:09:39.317037 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:10:09 crc kubenswrapper[4841]: I1203 17:10:09.316358 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:10:09 crc kubenswrapper[4841]: I1203 17:10:09.318683 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:10:39 crc kubenswrapper[4841]: I1203 
17:10:39.317416 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:10:39 crc kubenswrapper[4841]: I1203 17:10:39.318098 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:10:39 crc kubenswrapper[4841]: I1203 17:10:39.318159 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:10:39 crc kubenswrapper[4841]: I1203 17:10:39.318734 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eb72285561a179b3790b876fa15c5bce1255f9d0d5aee06faebb894344a15403"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:10:39 crc kubenswrapper[4841]: I1203 17:10:39.318799 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://eb72285561a179b3790b876fa15c5bce1255f9d0d5aee06faebb894344a15403" gracePeriod=600 Dec 03 17:10:40 crc kubenswrapper[4841]: I1203 17:10:40.159628 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="eb72285561a179b3790b876fa15c5bce1255f9d0d5aee06faebb894344a15403" exitCode=0 Dec 03 
17:10:40 crc kubenswrapper[4841]: I1203 17:10:40.159729 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"eb72285561a179b3790b876fa15c5bce1255f9d0d5aee06faebb894344a15403"} Dec 03 17:10:40 crc kubenswrapper[4841]: I1203 17:10:40.160120 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"1b26372173031353039ef7a8dc0bcb0ae6765d7cf65cf0a4fe3dfc913d879b03"} Dec 03 17:10:40 crc kubenswrapper[4841]: I1203 17:10:40.160160 4841 scope.go:117] "RemoveContainer" containerID="daa18669936eceb48d01d308236c99f78a8e0296b021db0d3c986e026fa5670c" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.046493 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-pmzsv"] Dec 03 17:11:31 crc kubenswrapper[4841]: E1203 17:11:31.047670 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c61a910-e9a4-4f77-a5d4-56e760ed1394" containerName="registry" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.047688 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c61a910-e9a4-4f77-a5d4-56e760ed1394" containerName="registry" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.047869 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c61a910-e9a4-4f77-a5d4-56e760ed1394" containerName="registry" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.048377 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-pmzsv" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.049961 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-l8d6c" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.052093 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.052227 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.063335 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qhbrc"] Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.067900 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qhbrc" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.075236 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2c5w9"] Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.077241 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.081567 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-566mk" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.081576 4841 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-fdtzt" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.085957 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-pmzsv"] Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.097154 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2c5w9"] Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.104648 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qhbrc"] Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.153648 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kwjs\" (UniqueName: \"kubernetes.io/projected/ae7ac6c5-8af0-40d3-9b0b-9009819f439d-kube-api-access-2kwjs\") pod \"cert-manager-cainjector-7f985d654d-pmzsv\" (UID: \"ae7ac6c5-8af0-40d3-9b0b-9009819f439d\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-pmzsv" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.255043 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2kwjs\" (UniqueName: \"kubernetes.io/projected/ae7ac6c5-8af0-40d3-9b0b-9009819f439d-kube-api-access-2kwjs\") pod \"cert-manager-cainjector-7f985d654d-pmzsv\" (UID: \"ae7ac6c5-8af0-40d3-9b0b-9009819f439d\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-pmzsv" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.255115 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-54tqc\" (UniqueName: \"kubernetes.io/projected/b6111350-39b6-4228-a2ac-3cc25ad33c50-kube-api-access-54tqc\") pod \"cert-manager-5b446d88c5-qhbrc\" (UID: \"b6111350-39b6-4228-a2ac-3cc25ad33c50\") " pod="cert-manager/cert-manager-5b446d88c5-qhbrc" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.255158 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfqr9\" (UniqueName: \"kubernetes.io/projected/6ea86ddf-89eb-471c-b9f5-1fef42cd94cd-kube-api-access-pfqr9\") pod \"cert-manager-webhook-5655c58dd6-2c5w9\" (UID: \"6ea86ddf-89eb-471c-b9f5-1fef42cd94cd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.285429 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kwjs\" (UniqueName: \"kubernetes.io/projected/ae7ac6c5-8af0-40d3-9b0b-9009819f439d-kube-api-access-2kwjs\") pod \"cert-manager-cainjector-7f985d654d-pmzsv\" (UID: \"ae7ac6c5-8af0-40d3-9b0b-9009819f439d\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-pmzsv" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.356700 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfqr9\" (UniqueName: \"kubernetes.io/projected/6ea86ddf-89eb-471c-b9f5-1fef42cd94cd-kube-api-access-pfqr9\") pod \"cert-manager-webhook-5655c58dd6-2c5w9\" (UID: \"6ea86ddf-89eb-471c-b9f5-1fef42cd94cd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.357218 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54tqc\" (UniqueName: \"kubernetes.io/projected/b6111350-39b6-4228-a2ac-3cc25ad33c50-kube-api-access-54tqc\") pod \"cert-manager-5b446d88c5-qhbrc\" (UID: \"b6111350-39b6-4228-a2ac-3cc25ad33c50\") " pod="cert-manager/cert-manager-5b446d88c5-qhbrc" Dec 03 17:11:31 crc 
kubenswrapper[4841]: I1203 17:11:31.377279 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfqr9\" (UniqueName: \"kubernetes.io/projected/6ea86ddf-89eb-471c-b9f5-1fef42cd94cd-kube-api-access-pfqr9\") pod \"cert-manager-webhook-5655c58dd6-2c5w9\" (UID: \"6ea86ddf-89eb-471c-b9f5-1fef42cd94cd\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.386888 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54tqc\" (UniqueName: \"kubernetes.io/projected/b6111350-39b6-4228-a2ac-3cc25ad33c50-kube-api-access-54tqc\") pod \"cert-manager-5b446d88c5-qhbrc\" (UID: \"b6111350-39b6-4228-a2ac-3cc25ad33c50\") " pod="cert-manager/cert-manager-5b446d88c5-qhbrc" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.387008 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-pmzsv" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.395671 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-qhbrc" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.409229 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.611844 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-pmzsv"] Dec 03 17:11:31 crc kubenswrapper[4841]: W1203 17:11:31.617292 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae7ac6c5_8af0_40d3_9b0b_9009819f439d.slice/crio-ee5d61e6306b11fb093fdfda6246fe0bb861a4f8c5e8689b9fc26610f384aa70 WatchSource:0}: Error finding container ee5d61e6306b11fb093fdfda6246fe0bb861a4f8c5e8689b9fc26610f384aa70: Status 404 returned error can't find the container with id ee5d61e6306b11fb093fdfda6246fe0bb861a4f8c5e8689b9fc26610f384aa70 Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.620710 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.662791 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-2c5w9"] Dec 03 17:11:31 crc kubenswrapper[4841]: W1203 17:11:31.668537 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea86ddf_89eb_471c_b9f5_1fef42cd94cd.slice/crio-30ac5006cc7e6defa96e70c7533a546299e8a8d45ddfca107ab53703888e1a03 WatchSource:0}: Error finding container 30ac5006cc7e6defa96e70c7533a546299e8a8d45ddfca107ab53703888e1a03: Status 404 returned error can't find the container with id 30ac5006cc7e6defa96e70c7533a546299e8a8d45ddfca107ab53703888e1a03 Dec 03 17:11:31 crc kubenswrapper[4841]: I1203 17:11:31.714258 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-qhbrc"] Dec 03 17:11:31 crc kubenswrapper[4841]: W1203 17:11:31.718792 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6111350_39b6_4228_a2ac_3cc25ad33c50.slice/crio-2f821655d3713a4a706a7b03e37d5b28a40c755c478baa86cdf7734a7f207dee WatchSource:0}: Error finding container 2f821655d3713a4a706a7b03e37d5b28a40c755c478baa86cdf7734a7f207dee: Status 404 returned error can't find the container with id 2f821655d3713a4a706a7b03e37d5b28a40c755c478baa86cdf7734a7f207dee Dec 03 17:11:32 crc kubenswrapper[4841]: I1203 17:11:32.535865 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qhbrc" event={"ID":"b6111350-39b6-4228-a2ac-3cc25ad33c50","Type":"ContainerStarted","Data":"2f821655d3713a4a706a7b03e37d5b28a40c755c478baa86cdf7734a7f207dee"} Dec 03 17:11:32 crc kubenswrapper[4841]: I1203 17:11:32.537538 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" event={"ID":"6ea86ddf-89eb-471c-b9f5-1fef42cd94cd","Type":"ContainerStarted","Data":"30ac5006cc7e6defa96e70c7533a546299e8a8d45ddfca107ab53703888e1a03"} Dec 03 17:11:32 crc kubenswrapper[4841]: I1203 17:11:32.539154 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-pmzsv" event={"ID":"ae7ac6c5-8af0-40d3-9b0b-9009819f439d","Type":"ContainerStarted","Data":"ee5d61e6306b11fb093fdfda6246fe0bb861a4f8c5e8689b9fc26610f384aa70"} Dec 03 17:11:39 crc kubenswrapper[4841]: I1203 17:11:39.586545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-qhbrc" event={"ID":"b6111350-39b6-4228-a2ac-3cc25ad33c50","Type":"ContainerStarted","Data":"98d10ff3e9e12b93582579849acfeddcf24863e5628c3be9d084f8e345e6efa5"} Dec 03 17:11:39 crc kubenswrapper[4841]: I1203 17:11:39.588962 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" 
event={"ID":"6ea86ddf-89eb-471c-b9f5-1fef42cd94cd","Type":"ContainerStarted","Data":"f7b159332aa8a0f7ae16f7a1e5ec0361d619aef262cf454dc1fd96236d4c5371"} Dec 03 17:11:39 crc kubenswrapper[4841]: I1203 17:11:39.589410 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" Dec 03 17:11:39 crc kubenswrapper[4841]: I1203 17:11:39.591519 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-pmzsv" event={"ID":"ae7ac6c5-8af0-40d3-9b0b-9009819f439d","Type":"ContainerStarted","Data":"b8a3541cb37bc95f1ef52848dd834aa7656ceff70885c7af679ad30734dc8497"} Dec 03 17:11:39 crc kubenswrapper[4841]: I1203 17:11:39.614803 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-qhbrc" podStartSLOduration=1.894235825 podStartE2EDuration="8.614758771s" podCreationTimestamp="2025-12-03 17:11:31 +0000 UTC" firstStartedPulling="2025-12-03 17:11:31.720605844 +0000 UTC m=+686.108126571" lastFinishedPulling="2025-12-03 17:11:38.44112878 +0000 UTC m=+692.828649517" observedRunningTime="2025-12-03 17:11:39.606532736 +0000 UTC m=+693.994053503" watchObservedRunningTime="2025-12-03 17:11:39.614758771 +0000 UTC m=+694.002279538" Dec 03 17:11:39 crc kubenswrapper[4841]: I1203 17:11:39.649595 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-pmzsv" podStartSLOduration=1.824170805 podStartE2EDuration="8.649567259s" podCreationTimestamp="2025-12-03 17:11:31 +0000 UTC" firstStartedPulling="2025-12-03 17:11:31.620434849 +0000 UTC m=+686.007955576" lastFinishedPulling="2025-12-03 17:11:38.445831283 +0000 UTC m=+692.833352030" observedRunningTime="2025-12-03 17:11:39.638510191 +0000 UTC m=+694.026030958" watchObservedRunningTime="2025-12-03 17:11:39.649567259 +0000 UTC m=+694.037088026" Dec 03 17:11:39 crc kubenswrapper[4841]: I1203 17:11:39.670254 
4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" podStartSLOduration=1.910157242 podStartE2EDuration="8.670216489s" podCreationTimestamp="2025-12-03 17:11:31 +0000 UTC" firstStartedPulling="2025-12-03 17:11:31.67177947 +0000 UTC m=+686.059300197" lastFinishedPulling="2025-12-03 17:11:38.431838677 +0000 UTC m=+692.819359444" observedRunningTime="2025-12-03 17:11:39.662014624 +0000 UTC m=+694.049535381" watchObservedRunningTime="2025-12-03 17:11:39.670216489 +0000 UTC m=+694.057737256" Dec 03 17:11:41 crc kubenswrapper[4841]: I1203 17:11:41.935533 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5svt"] Dec 03 17:11:41 crc kubenswrapper[4841]: I1203 17:11:41.936004 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="northd" containerID="cri-o://1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1" gracePeriod=30 Dec 03 17:11:41 crc kubenswrapper[4841]: I1203 17:11:41.936097 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="sbdb" containerID="cri-o://1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3" gracePeriod=30 Dec 03 17:11:41 crc kubenswrapper[4841]: I1203 17:11:41.936207 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kube-rbac-proxy-node" containerID="cri-o://de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c" gracePeriod=30 Dec 03 17:11:41 crc kubenswrapper[4841]: I1203 17:11:41.936139 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="nbdb" containerID="cri-o://ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2" gracePeriod=30 Dec 03 17:11:41 crc kubenswrapper[4841]: I1203 17:11:41.936297 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84" gracePeriod=30 Dec 03 17:11:41 crc kubenswrapper[4841]: I1203 17:11:41.936339 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovn-acl-logging" containerID="cri-o://0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa" gracePeriod=30 Dec 03 17:11:41 crc kubenswrapper[4841]: I1203 17:11:41.938003 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovn-controller" containerID="cri-o://daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d" gracePeriod=30 Dec 03 17:11:41 crc kubenswrapper[4841]: I1203 17:11:41.978921 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" containerID="cri-o://170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421" gracePeriod=30 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.202834 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/3.log" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 
17:11:42.204999 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovn-acl-logging/0.log" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.205442 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovn-controller/0.log" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.206152 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256093 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-6nzl7"] Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256494 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256523 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256550 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256562 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256582 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256594 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" 
containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256609 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256623 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256639 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kube-rbac-proxy-node" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256651 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kube-rbac-proxy-node" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256675 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="nbdb" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256687 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="nbdb" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256703 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kubecfg-setup" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256716 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kubecfg-setup" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256737 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="northd" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256754 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="northd" Dec 03 
17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256775 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="sbdb" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256791 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="sbdb" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256815 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovn-acl-logging" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256832 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovn-acl-logging" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256848 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256862 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.256878 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovn-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.256889 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovn-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257103 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="nbdb" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257136 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 17:11:42 crc 
kubenswrapper[4841]: I1203 17:11:42.257157 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovn-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257175 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovn-acl-logging" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257198 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="kube-rbac-proxy-node" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257224 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="northd" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257242 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257263 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257344 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="sbdb" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257364 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257379 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257392 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" 
containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.257570 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.257601 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerName="ovnkube-controller" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.261864 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.400993 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1853b500-b218-4412-9cbc-9fd0a76778c0-ovn-node-metrics-cert\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.401326 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-bin\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.401431 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.401585 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-systemd-units\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.401769 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-ovn\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.401947 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-var-lib-openvswitch\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.402353 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-netd\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.402542 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-log-socket\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.402775 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xgj9j\" (UniqueName: 
\"kubernetes.io/projected/1853b500-b218-4412-9cbc-9fd0a76778c0-kube-api-access-xgj9j\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.403023 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-kubelet\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.403250 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-systemd\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.403474 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-script-lib\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.403726 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-config\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.403935 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-openvswitch\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 
17:11:42.404139 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-env-overrides\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404319 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-ovn-kubernetes\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404457 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-netns\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404597 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.401657 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.401897 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.402005 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.402396 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.402613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-log-socket" (OuterVolumeSpecName: "log-socket") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.403119 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404254 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404307 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404344 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404986 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-slash\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404405 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405235 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-node-log\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404760 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404852 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.404954 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405303 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-etc-openvswitch\") pod \"1853b500-b218-4412-9cbc-9fd0a76778c0\" (UID: \"1853b500-b218-4412-9cbc-9fd0a76778c0\") " Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405322 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-node-log" (OuterVolumeSpecName: "node-log") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405435 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405683 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/773ebca5-a623-4df5-94e2-76c5d15c9658-ovnkube-script-lib\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405740 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttnxg\" (UniqueName: \"kubernetes.io/projected/773ebca5-a623-4df5-94e2-76c5d15c9658-kube-api-access-ttnxg\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405829 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/773ebca5-a623-4df5-94e2-76c5d15c9658-ovn-node-metrics-cert\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405862 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-cni-bin\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405898 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-kubelet\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.405940 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-log-socket\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406016 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/773ebca5-a623-4df5-94e2-76c5d15c9658-ovnkube-config\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406147 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-etc-openvswitch\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406219 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-run-systemd\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406275 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-run-openvswitch\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406311 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-run-netns\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406373 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-run-ovn\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406459 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-node-log\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406565 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-slash\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406592 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406622 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-cni-netd\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406665 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-var-lib-openvswitch\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406793 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/773ebca5-a623-4df5-94e2-76c5d15c9658-env-overrides\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406896 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-run-ovn-kubernetes\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.406967 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-systemd-units\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407054 4841 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407069 4841 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407083 4841 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407095 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407106 4841 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407116 4841 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407126 4841 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/1853b500-b218-4412-9cbc-9fd0a76778c0-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407138 4841 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407148 4841 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407162 4841 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407175 4841 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407186 4841 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407199 4841 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407209 4841 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407220 4841 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.407230 4841 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.408738 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-slash" (OuterVolumeSpecName: "host-slash") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.411245 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1853b500-b218-4412-9cbc-9fd0a76778c0-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.411993 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1853b500-b218-4412-9cbc-9fd0a76778c0-kube-api-access-xgj9j" (OuterVolumeSpecName: "kube-api-access-xgj9j") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "kube-api-access-xgj9j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.415765 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "1853b500-b218-4412-9cbc-9fd0a76778c0" (UID: "1853b500-b218-4412-9cbc-9fd0a76778c0"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508059 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-log-socket\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508177 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-log-socket\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508188 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/773ebca5-a623-4df5-94e2-76c5d15c9658-ovnkube-config\") pod 
\"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508307 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-etc-openvswitch\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508352 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-run-systemd\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508400 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-run-openvswitch\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508443 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-run-netns\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508445 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-etc-openvswitch\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508508 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-run-ovn\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508547 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-run-openvswitch\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-node-log\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508603 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-run-ovn\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508627 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc 
kubenswrapper[4841]: I1203 17:11:42.508650 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-run-netns\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508664 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-slash\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-node-log\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508703 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-cni-netd\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508509 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-run-systemd\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-var-lib-openvswitch\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508757 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.508784 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/773ebca5-a623-4df5-94e2-76c5d15c9658-env-overrides\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509005 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-run-ovn-kubernetes\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509074 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-systemd-units\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509091 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-var-lib-openvswitch\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509075 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-slash\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509121 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-run-ovn-kubernetes\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509131 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/773ebca5-a623-4df5-94e2-76c5d15c9658-ovnkube-script-lib\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509188 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttnxg\" (UniqueName: \"kubernetes.io/projected/773ebca5-a623-4df5-94e2-76c5d15c9658-kube-api-access-ttnxg\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509235 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-systemd-units\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509256 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/773ebca5-a623-4df5-94e2-76c5d15c9658-ovn-node-metrics-cert\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509445 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-cni-bin\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509514 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-kubelet\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-cni-bin\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509608 4841 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-host-slash\") on node \"crc\" DevicePath \"\"" Dec 
03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509632 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/1853b500-b218-4412-9cbc-9fd0a76778c0-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509654 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xgj9j\" (UniqueName: \"kubernetes.io/projected/1853b500-b218-4412-9cbc-9fd0a76778c0-kube-api-access-xgj9j\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509675 4841 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/1853b500-b218-4412-9cbc-9fd0a76778c0-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509673 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-kubelet\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.509870 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/773ebca5-a623-4df5-94e2-76c5d15c9658-host-cni-netd\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.510406 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/773ebca5-a623-4df5-94e2-76c5d15c9658-env-overrides\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 
17:11:42.510713 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/773ebca5-a623-4df5-94e2-76c5d15c9658-ovnkube-script-lib\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.514855 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/773ebca5-a623-4df5-94e2-76c5d15c9658-ovn-node-metrics-cert\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.512664 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/773ebca5-a623-4df5-94e2-76c5d15c9658-ovnkube-config\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.526038 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttnxg\" (UniqueName: \"kubernetes.io/projected/773ebca5-a623-4df5-94e2-76c5d15c9658-kube-api-access-ttnxg\") pod \"ovnkube-node-6nzl7\" (UID: \"773ebca5-a623-4df5-94e2-76c5d15c9658\") " pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.583409 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.611129 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovnkube-controller/3.log" Dec 03 17:11:42 crc kubenswrapper[4841]: W1203 17:11:42.612445 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod773ebca5_a623_4df5_94e2_76c5d15c9658.slice/crio-c3cd727d240dea2667b290050ab0617d2203984826ca226db04e00d6fcaddc82 WatchSource:0}: Error finding container c3cd727d240dea2667b290050ab0617d2203984826ca226db04e00d6fcaddc82: Status 404 returned error can't find the container with id c3cd727d240dea2667b290050ab0617d2203984826ca226db04e00d6fcaddc82 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.614169 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovn-acl-logging/0.log" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.614772 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d5svt_1853b500-b218-4412-9cbc-9fd0a76778c0/ovn-controller/0.log" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615225 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421" exitCode=0 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615249 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3" exitCode=0 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615257 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" 
containerID="ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2" exitCode=0 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615266 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1" exitCode=0 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615274 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84" exitCode=0 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615281 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c" exitCode=0 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615289 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa" exitCode=143 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615298 4841 generic.go:334] "Generic (PLEG): container finished" podID="1853b500-b218-4412-9cbc-9fd0a76778c0" containerID="daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d" exitCode=143 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615365 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615397 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" 
event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615414 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615428 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615439 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615453 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615466 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615478 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615485 4841 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615492 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615499 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615505 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615513 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615519 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615526 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615535 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} Dec 03 
17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615546 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615553 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615560 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615566 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615573 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615579 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615586 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615593 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} Dec 03 
17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615599 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615606 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615615 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615626 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615654 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615662 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615669 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615702 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615710 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615717 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615724 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615731 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615738 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615750 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" event={"ID":"1853b500-b218-4412-9cbc-9fd0a76778c0","Type":"ContainerDied","Data":"da34b91f09c49509e843256b513d328bb83f8b3f9364e0a11f24b39df1a668ca"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615762 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615770 4841 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615778 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615784 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615793 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615800 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615806 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615812 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615819 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615826 4841 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.615842 4841 scope.go:117] "RemoveContainer" containerID="170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.616027 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d5svt" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.625853 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/2.log" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.626613 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/1.log" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.626687 4841 generic.go:334] "Generic (PLEG): container finished" podID="0752d936-15ef-4e17-8463-3185a4c1863b" containerID="60feff1c96c8347251d94b6c38b88ed2e6cc0f7947e8fdbbd1c1657433cccc05" exitCode=2 Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.626732 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qwsc4" event={"ID":"0752d936-15ef-4e17-8463-3185a4c1863b","Type":"ContainerDied","Data":"60feff1c96c8347251d94b6c38b88ed2e6cc0f7947e8fdbbd1c1657433cccc05"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.626769 4841 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4"} Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.627462 4841 scope.go:117] "RemoveContainer" containerID="60feff1c96c8347251d94b6c38b88ed2e6cc0f7947e8fdbbd1c1657433cccc05" Dec 03 17:11:42 crc 
kubenswrapper[4841]: E1203 17:11:42.627801 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qwsc4_openshift-multus(0752d936-15ef-4e17-8463-3185a4c1863b)\"" pod="openshift-multus/multus-qwsc4" podUID="0752d936-15ef-4e17-8463-3185a4c1863b" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.655559 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.690064 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5svt"] Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.696692 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-d5svt"] Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.708192 4841 scope.go:117] "RemoveContainer" containerID="1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.732966 4841 scope.go:117] "RemoveContainer" containerID="ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.754593 4841 scope.go:117] "RemoveContainer" containerID="1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.771958 4841 scope.go:117] "RemoveContainer" containerID="e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.792268 4841 scope.go:117] "RemoveContainer" containerID="de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.809490 4841 scope.go:117] "RemoveContainer" containerID="0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa" Dec 03 17:11:42 crc 
kubenswrapper[4841]: I1203 17:11:42.868646 4841 scope.go:117] "RemoveContainer" containerID="daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.883607 4841 scope.go:117] "RemoveContainer" containerID="9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.895188 4841 scope.go:117] "RemoveContainer" containerID="170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.895452 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": container with ID starting with 170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421 not found: ID does not exist" containerID="170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.895485 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} err="failed to get container status \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": rpc error: code = NotFound desc = could not find container \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": container with ID starting with 170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.895509 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.895802 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\": container with ID starting with 0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8 not found: ID does not exist" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.895848 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} err="failed to get container status \"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\": rpc error: code = NotFound desc = could not find container \"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\": container with ID starting with 0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.895876 4841 scope.go:117] "RemoveContainer" containerID="1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.896095 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\": container with ID starting with 1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3 not found: ID does not exist" containerID="1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.896126 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} err="failed to get container status \"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\": rpc error: code = NotFound desc = could not find container \"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\": container with ID 
starting with 1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.896146 4841 scope.go:117] "RemoveContainer" containerID="ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.896418 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\": container with ID starting with ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2 not found: ID does not exist" containerID="ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.896455 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} err="failed to get container status \"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\": rpc error: code = NotFound desc = could not find container \"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\": container with ID starting with ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.896482 4841 scope.go:117] "RemoveContainer" containerID="1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.898145 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\": container with ID starting with 1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1 not found: ID does not exist" containerID="1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1" Dec 03 
17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.898179 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} err="failed to get container status \"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\": rpc error: code = NotFound desc = could not find container \"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\": container with ID starting with 1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.898198 4841 scope.go:117] "RemoveContainer" containerID="e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.899116 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\": container with ID starting with e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84 not found: ID does not exist" containerID="e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.899145 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} err="failed to get container status \"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\": rpc error: code = NotFound desc = could not find container \"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\": container with ID starting with e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.899163 4841 scope.go:117] "RemoveContainer" 
containerID="de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.899447 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\": container with ID starting with de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c not found: ID does not exist" containerID="de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.899470 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} err="failed to get container status \"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\": rpc error: code = NotFound desc = could not find container \"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\": container with ID starting with de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.899488 4841 scope.go:117] "RemoveContainer" containerID="0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.899819 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\": container with ID starting with 0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa not found: ID does not exist" containerID="0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.899847 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} err="failed to get container status \"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\": rpc error: code = NotFound desc = could not find container \"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\": container with ID starting with 0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.899864 4841 scope.go:117] "RemoveContainer" containerID="daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.900149 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\": container with ID starting with daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d not found: ID does not exist" containerID="daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.900176 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} err="failed to get container status \"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\": rpc error: code = NotFound desc = could not find container \"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\": container with ID starting with daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.900192 4841 scope.go:117] "RemoveContainer" containerID="9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc" Dec 03 17:11:42 crc kubenswrapper[4841]: E1203 17:11:42.901528 4841 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\": container with ID starting with 9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc not found: ID does not exist" containerID="9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.901556 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc"} err="failed to get container status \"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\": rpc error: code = NotFound desc = could not find container \"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\": container with ID starting with 9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.901574 4841 scope.go:117] "RemoveContainer" containerID="170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.901857 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} err="failed to get container status \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": rpc error: code = NotFound desc = could not find container \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": container with ID starting with 170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.901882 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.902399 4841 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} err="failed to get container status \"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\": rpc error: code = NotFound desc = could not find container \"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\": container with ID starting with 0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.902423 4841 scope.go:117] "RemoveContainer" containerID="1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.905462 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} err="failed to get container status \"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\": rpc error: code = NotFound desc = could not find container \"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\": container with ID starting with 1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.905492 4841 scope.go:117] "RemoveContainer" containerID="ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.905727 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} err="failed to get container status \"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\": rpc error: code = NotFound desc = could not find container \"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\": container with ID starting with 
ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.905746 4841 scope.go:117] "RemoveContainer" containerID="1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.905971 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} err="failed to get container status \"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\": rpc error: code = NotFound desc = could not find container \"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\": container with ID starting with 1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.905990 4841 scope.go:117] "RemoveContainer" containerID="e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.906253 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} err="failed to get container status \"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\": rpc error: code = NotFound desc = could not find container \"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\": container with ID starting with e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.906288 4841 scope.go:117] "RemoveContainer" containerID="de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.906523 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} err="failed to get container status \"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\": rpc error: code = NotFound desc = could not find container \"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\": container with ID starting with de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.906547 4841 scope.go:117] "RemoveContainer" containerID="0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.906746 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} err="failed to get container status \"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\": rpc error: code = NotFound desc = could not find container \"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\": container with ID starting with 0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.906771 4841 scope.go:117] "RemoveContainer" containerID="daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.906954 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} err="failed to get container status \"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\": rpc error: code = NotFound desc = could not find container \"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\": container with ID starting with daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d not found: ID does not 
exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.906975 4841 scope.go:117] "RemoveContainer" containerID="9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.907161 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc"} err="failed to get container status \"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\": rpc error: code = NotFound desc = could not find container \"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\": container with ID starting with 9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.907179 4841 scope.go:117] "RemoveContainer" containerID="170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.907364 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} err="failed to get container status \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": rpc error: code = NotFound desc = could not find container \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": container with ID starting with 170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.907387 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.907639 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} err="failed to get container status 
\"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\": rpc error: code = NotFound desc = could not find container \"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\": container with ID starting with 0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.907665 4841 scope.go:117] "RemoveContainer" containerID="1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.907885 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} err="failed to get container status \"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\": rpc error: code = NotFound desc = could not find container \"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\": container with ID starting with 1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.907930 4841 scope.go:117] "RemoveContainer" containerID="ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.908120 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} err="failed to get container status \"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\": rpc error: code = NotFound desc = could not find container \"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\": container with ID starting with ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.908138 4841 scope.go:117] "RemoveContainer" 
containerID="1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.908386 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} err="failed to get container status \"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\": rpc error: code = NotFound desc = could not find container \"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\": container with ID starting with 1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.908404 4841 scope.go:117] "RemoveContainer" containerID="e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.919420 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} err="failed to get container status \"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\": rpc error: code = NotFound desc = could not find container \"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\": container with ID starting with e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.919468 4841 scope.go:117] "RemoveContainer" containerID="de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.920098 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} err="failed to get container status \"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\": rpc error: code = NotFound desc = could 
not find container \"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\": container with ID starting with de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.920123 4841 scope.go:117] "RemoveContainer" containerID="0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.922242 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} err="failed to get container status \"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\": rpc error: code = NotFound desc = could not find container \"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\": container with ID starting with 0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.922297 4841 scope.go:117] "RemoveContainer" containerID="daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.922770 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} err="failed to get container status \"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\": rpc error: code = NotFound desc = could not find container \"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\": container with ID starting with daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.922803 4841 scope.go:117] "RemoveContainer" containerID="9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 
17:11:42.923035 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc"} err="failed to get container status \"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\": rpc error: code = NotFound desc = could not find container \"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\": container with ID starting with 9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.923061 4841 scope.go:117] "RemoveContainer" containerID="170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.923308 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} err="failed to get container status \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": rpc error: code = NotFound desc = could not find container \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": container with ID starting with 170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.923345 4841 scope.go:117] "RemoveContainer" containerID="0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.923602 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8"} err="failed to get container status \"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\": rpc error: code = NotFound desc = could not find container \"0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8\": container with ID starting with 
0e1ccf6df6d681ff361fb846976760e73e7338ab7f0ce9dfe2483d9ce5847bc8 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.923649 4841 scope.go:117] "RemoveContainer" containerID="1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.923948 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3"} err="failed to get container status \"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\": rpc error: code = NotFound desc = could not find container \"1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3\": container with ID starting with 1baefb2e5a62a42b179fec089462cf60ed9e930bf2ea7a272de54d16b4a93dd3 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.923972 4841 scope.go:117] "RemoveContainer" containerID="ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.924358 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2"} err="failed to get container status \"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\": rpc error: code = NotFound desc = could not find container \"ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2\": container with ID starting with ee6e6093e21b8566698bbf633d1b30cdd90ed21707fc6650809e955bea8f43f2 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.924385 4841 scope.go:117] "RemoveContainer" containerID="1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.924958 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1"} err="failed to get container status \"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\": rpc error: code = NotFound desc = could not find container \"1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1\": container with ID starting with 1db50af49d825a630f9366541beed85397e4a93ee75386a79fb527180a28d9a1 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.924985 4841 scope.go:117] "RemoveContainer" containerID="e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.925284 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84"} err="failed to get container status \"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\": rpc error: code = NotFound desc = could not find container \"e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84\": container with ID starting with e19609055583615d791394730073c1c95ea6fb14572be7ddc36e1a9c3652ec84 not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.925319 4841 scope.go:117] "RemoveContainer" containerID="de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.925634 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c"} err="failed to get container status \"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\": rpc error: code = NotFound desc = could not find container \"de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c\": container with ID starting with de885d48fcf1851c75aa2cc06dd295950c5e0d793aff146d835fbaf4157b6d2c not found: ID does not 
exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.925661 4841 scope.go:117] "RemoveContainer" containerID="0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.925984 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa"} err="failed to get container status \"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\": rpc error: code = NotFound desc = could not find container \"0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa\": container with ID starting with 0c2f3a0c261030c29a127a482da0d77a34969375a70ac08375792a646d21c1fa not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.926010 4841 scope.go:117] "RemoveContainer" containerID="daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.926243 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d"} err="failed to get container status \"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\": rpc error: code = NotFound desc = could not find container \"daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d\": container with ID starting with daaf8b8f26f70a7b5184279e615f180d3f3419a596baf9b2c4bc2eb93c33a13d not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.926264 4841 scope.go:117] "RemoveContainer" containerID="9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.926536 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc"} err="failed to get container status 
\"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\": rpc error: code = NotFound desc = could not find container \"9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc\": container with ID starting with 9fcf2ec44f6b493bd4c27d26be4a159c53e610338201da820b880d38f659bbcc not found: ID does not exist" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.926552 4841 scope.go:117] "RemoveContainer" containerID="170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421" Dec 03 17:11:42 crc kubenswrapper[4841]: I1203 17:11:42.926752 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421"} err="failed to get container status \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": rpc error: code = NotFound desc = could not find container \"170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421\": container with ID starting with 170a8ef92e763e1d0a0fad7b58d5a418e8cabf5349c3ba854f64c36c83720421 not found: ID does not exist" Dec 03 17:11:43 crc kubenswrapper[4841]: I1203 17:11:43.637612 4841 generic.go:334] "Generic (PLEG): container finished" podID="773ebca5-a623-4df5-94e2-76c5d15c9658" containerID="888c144da45f0c7c00a728933e5456675029c0299c750229bbd22dfaeca2dddb" exitCode=0 Dec 03 17:11:43 crc kubenswrapper[4841]: I1203 17:11:43.637702 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerDied","Data":"888c144da45f0c7c00a728933e5456675029c0299c750229bbd22dfaeca2dddb"} Dec 03 17:11:43 crc kubenswrapper[4841]: I1203 17:11:43.637819 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerStarted","Data":"c3cd727d240dea2667b290050ab0617d2203984826ca226db04e00d6fcaddc82"} Dec 03 
17:11:44 crc kubenswrapper[4841]: I1203 17:11:44.260870 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1853b500-b218-4412-9cbc-9fd0a76778c0" path="/var/lib/kubelet/pods/1853b500-b218-4412-9cbc-9fd0a76778c0/volumes" Dec 03 17:11:44 crc kubenswrapper[4841]: I1203 17:11:44.649753 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerStarted","Data":"b1520edf29603211cb819843f6c3ae07dde25d59e8be208ae638bf3ed319d701"} Dec 03 17:11:44 crc kubenswrapper[4841]: I1203 17:11:44.652768 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerStarted","Data":"9cffa45152e4b8fb2643e7ffc7da4f05a69ab9f88396e7d1f1e25abaa77d8169"} Dec 03 17:11:44 crc kubenswrapper[4841]: I1203 17:11:44.652973 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerStarted","Data":"2bbdd6eddd4fa6b37799bbc425637b26fbe1691f57cfa1dadb07bea4f7fcaba5"} Dec 03 17:11:44 crc kubenswrapper[4841]: I1203 17:11:44.653123 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerStarted","Data":"d6d85f416035e5fa4aaee0f917921bbd73ad521cba7916131813ae7367a4d444"} Dec 03 17:11:44 crc kubenswrapper[4841]: I1203 17:11:44.653296 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerStarted","Data":"61346d1d8838a29a1210de804f9db52c3ff9052c68bff6e1bf62e1ccc5f408cc"} Dec 03 17:11:45 crc kubenswrapper[4841]: I1203 17:11:45.665158 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerStarted","Data":"474ffb757321774cae6df03a5b022df7600429f5f33ff304d277a95399ac726a"} Dec 03 17:11:46 crc kubenswrapper[4841]: I1203 17:11:46.414172 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-2c5w9" Dec 03 17:11:47 crc kubenswrapper[4841]: I1203 17:11:47.684182 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerStarted","Data":"1348028f20a28182eba36b4ee44e0552a73835c51a80e3290ad0023c7edf2f72"} Dec 03 17:11:49 crc kubenswrapper[4841]: I1203 17:11:49.702668 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" event={"ID":"773ebca5-a623-4df5-94e2-76c5d15c9658","Type":"ContainerStarted","Data":"7b27aef872784c9785974213c7a02802e682e0a5854ac6705cec1fa1abb0c607"} Dec 03 17:11:49 crc kubenswrapper[4841]: I1203 17:11:49.703054 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:49 crc kubenswrapper[4841]: I1203 17:11:49.732411 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" podStartSLOduration=7.732396016 podStartE2EDuration="7.732396016s" podCreationTimestamp="2025-12-03 17:11:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:11:49.731444632 +0000 UTC m=+704.118965349" watchObservedRunningTime="2025-12-03 17:11:49.732396016 +0000 UTC m=+704.119916743" Dec 03 17:11:49 crc kubenswrapper[4841]: I1203 17:11:49.740333 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:50 crc 
kubenswrapper[4841]: I1203 17:11:50.707751 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:50 crc kubenswrapper[4841]: I1203 17:11:50.708108 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:50 crc kubenswrapper[4841]: I1203 17:11:50.738932 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:11:58 crc kubenswrapper[4841]: I1203 17:11:58.238511 4841 scope.go:117] "RemoveContainer" containerID="60feff1c96c8347251d94b6c38b88ed2e6cc0f7947e8fdbbd1c1657433cccc05" Dec 03 17:11:58 crc kubenswrapper[4841]: E1203 17:11:58.239427 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-qwsc4_openshift-multus(0752d936-15ef-4e17-8463-3185a4c1863b)\"" pod="openshift-multus/multus-qwsc4" podUID="0752d936-15ef-4e17-8463-3185a4c1863b" Dec 03 17:12:06 crc kubenswrapper[4841]: I1203 17:12:06.519992 4841 scope.go:117] "RemoveContainer" containerID="f7fc99131841665710d653ae9f7daca7a5437a3bb35a9fd9fe57f359493bc7e4" Dec 03 17:12:06 crc kubenswrapper[4841]: I1203 17:12:06.813067 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/2.log" Dec 03 17:12:12 crc kubenswrapper[4841]: I1203 17:12:12.615791 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6nzl7" Dec 03 17:12:13 crc kubenswrapper[4841]: I1203 17:12:13.239409 4841 scope.go:117] "RemoveContainer" containerID="60feff1c96c8347251d94b6c38b88ed2e6cc0f7947e8fdbbd1c1657433cccc05" Dec 03 17:12:13 crc kubenswrapper[4841]: I1203 17:12:13.862892 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-qwsc4_0752d936-15ef-4e17-8463-3185a4c1863b/kube-multus/2.log" Dec 03 17:12:13 crc kubenswrapper[4841]: I1203 17:12:13.863321 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qwsc4" event={"ID":"0752d936-15ef-4e17-8463-3185a4c1863b","Type":"ContainerStarted","Data":"9d9a36445500dc7289d96ba55ed15a0240a456d9312e370014e53784dd9ba831"} Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.307177 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt"] Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.308954 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.316294 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.319928 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt"] Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.473578 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk8fv\" (UniqueName: \"kubernetes.io/projected/1f10118c-20c4-47f3-b078-673dd01ce685-kube-api-access-sk8fv\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.473668 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.473725 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.575460 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk8fv\" (UniqueName: \"kubernetes.io/projected/1f10118c-20c4-47f3-b078-673dd01ce685-kube-api-access-sk8fv\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.575523 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.575557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-bundle\") pod 
\"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.576055 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.576357 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.609853 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk8fv\" (UniqueName: \"kubernetes.io/projected/1f10118c-20c4-47f3-b078-673dd01ce685-kube-api-access-sk8fv\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.637671 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.898723 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt"] Dec 03 17:12:28 crc kubenswrapper[4841]: I1203 17:12:28.967885 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" event={"ID":"1f10118c-20c4-47f3-b078-673dd01ce685","Type":"ContainerStarted","Data":"f5cab51bd824978af8238d802785100a46d42b7b42ca431ddbdde8553312b23d"} Dec 03 17:12:29 crc kubenswrapper[4841]: I1203 17:12:29.982563 4841 generic.go:334] "Generic (PLEG): container finished" podID="1f10118c-20c4-47f3-b078-673dd01ce685" containerID="3a21eb34c8e2b5fc94505456fb1ac76c658aeda7001b1fd58c56167a14f3c070" exitCode=0 Dec 03 17:12:29 crc kubenswrapper[4841]: I1203 17:12:29.982658 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" event={"ID":"1f10118c-20c4-47f3-b078-673dd01ce685","Type":"ContainerDied","Data":"3a21eb34c8e2b5fc94505456fb1ac76c658aeda7001b1fd58c56167a14f3c070"} Dec 03 17:12:30 crc kubenswrapper[4841]: I1203 17:12:30.998337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" event={"ID":"1f10118c-20c4-47f3-b078-673dd01ce685","Type":"ContainerStarted","Data":"27b01af80650de63eaca378274e8a3b35481145b0581c3b40c61415e09f2d025"} Dec 03 17:12:32 crc kubenswrapper[4841]: I1203 17:12:32.007829 4841 generic.go:334] "Generic (PLEG): container finished" podID="1f10118c-20c4-47f3-b078-673dd01ce685" containerID="27b01af80650de63eaca378274e8a3b35481145b0581c3b40c61415e09f2d025" exitCode=0 Dec 03 17:12:32 crc kubenswrapper[4841]: I1203 17:12:32.007984 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" event={"ID":"1f10118c-20c4-47f3-b078-673dd01ce685","Type":"ContainerDied","Data":"27b01af80650de63eaca378274e8a3b35481145b0581c3b40c61415e09f2d025"} Dec 03 17:12:33 crc kubenswrapper[4841]: I1203 17:12:33.019990 4841 generic.go:334] "Generic (PLEG): container finished" podID="1f10118c-20c4-47f3-b078-673dd01ce685" containerID="465bb3e4aec3633d98e1775347f422d32cdef0d79d0bd3ced02111bdb45f9d09" exitCode=0 Dec 03 17:12:33 crc kubenswrapper[4841]: I1203 17:12:33.020098 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" event={"ID":"1f10118c-20c4-47f3-b078-673dd01ce685","Type":"ContainerDied","Data":"465bb3e4aec3633d98e1775347f422d32cdef0d79d0bd3ced02111bdb45f9d09"} Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.362650 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.484237 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sk8fv\" (UniqueName: \"kubernetes.io/projected/1f10118c-20c4-47f3-b078-673dd01ce685-kube-api-access-sk8fv\") pod \"1f10118c-20c4-47f3-b078-673dd01ce685\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.484279 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-bundle\") pod \"1f10118c-20c4-47f3-b078-673dd01ce685\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.484329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-util\") pod \"1f10118c-20c4-47f3-b078-673dd01ce685\" (UID: \"1f10118c-20c4-47f3-b078-673dd01ce685\") " Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.485520 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-bundle" (OuterVolumeSpecName: "bundle") pod "1f10118c-20c4-47f3-b078-673dd01ce685" (UID: "1f10118c-20c4-47f3-b078-673dd01ce685"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.491650 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f10118c-20c4-47f3-b078-673dd01ce685-kube-api-access-sk8fv" (OuterVolumeSpecName: "kube-api-access-sk8fv") pod "1f10118c-20c4-47f3-b078-673dd01ce685" (UID: "1f10118c-20c4-47f3-b078-673dd01ce685"). InnerVolumeSpecName "kube-api-access-sk8fv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.497751 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-util" (OuterVolumeSpecName: "util") pod "1f10118c-20c4-47f3-b078-673dd01ce685" (UID: "1f10118c-20c4-47f3-b078-673dd01ce685"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.585750 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sk8fv\" (UniqueName: \"kubernetes.io/projected/1f10118c-20c4-47f3-b078-673dd01ce685-kube-api-access-sk8fv\") on node \"crc\" DevicePath \"\"" Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.585788 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:12:34 crc kubenswrapper[4841]: I1203 17:12:34.585801 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1f10118c-20c4-47f3-b078-673dd01ce685-util\") on node \"crc\" DevicePath \"\"" Dec 03 17:12:35 crc kubenswrapper[4841]: I1203 17:12:35.039563 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" event={"ID":"1f10118c-20c4-47f3-b078-673dd01ce685","Type":"ContainerDied","Data":"f5cab51bd824978af8238d802785100a46d42b7b42ca431ddbdde8553312b23d"} Dec 03 17:12:35 crc kubenswrapper[4841]: I1203 17:12:35.039635 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f5cab51bd824978af8238d802785100a46d42b7b42ca431ddbdde8553312b23d" Dec 03 17:12:35 crc kubenswrapper[4841]: I1203 17:12:35.039688 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.552438 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m"] Dec 03 17:12:37 crc kubenswrapper[4841]: E1203 17:12:37.552889 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f10118c-20c4-47f3-b078-673dd01ce685" containerName="extract" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.552918 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f10118c-20c4-47f3-b078-673dd01ce685" containerName="extract" Dec 03 17:12:37 crc kubenswrapper[4841]: E1203 17:12:37.552939 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f10118c-20c4-47f3-b078-673dd01ce685" containerName="util" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.552947 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f10118c-20c4-47f3-b078-673dd01ce685" containerName="util" Dec 03 17:12:37 crc kubenswrapper[4841]: E1203 17:12:37.552967 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f10118c-20c4-47f3-b078-673dd01ce685" containerName="pull" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.552975 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f10118c-20c4-47f3-b078-673dd01ce685" containerName="pull" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.553083 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f10118c-20c4-47f3-b078-673dd01ce685" containerName="extract" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.553479 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.556366 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7jxvc" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.556431 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.556595 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.561522 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m"] Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.628629 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gzgp\" (UniqueName: \"kubernetes.io/projected/cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0-kube-api-access-7gzgp\") pod \"nmstate-operator-5b5b58f5c8-qw72m\" (UID: \"cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.729986 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gzgp\" (UniqueName: \"kubernetes.io/projected/cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0-kube-api-access-7gzgp\") pod \"nmstate-operator-5b5b58f5c8-qw72m\" (UID: \"cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.756160 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gzgp\" (UniqueName: \"kubernetes.io/projected/cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0-kube-api-access-7gzgp\") pod \"nmstate-operator-5b5b58f5c8-qw72m\" (UID: 
\"cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m" Dec 03 17:12:37 crc kubenswrapper[4841]: I1203 17:12:37.866712 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m" Dec 03 17:12:38 crc kubenswrapper[4841]: I1203 17:12:38.140534 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m"] Dec 03 17:12:39 crc kubenswrapper[4841]: I1203 17:12:39.073397 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m" event={"ID":"cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0","Type":"ContainerStarted","Data":"98fd214bc6d53a1ba2afd95bb5c044815ce943a631da7a8e5b31323b625f3cb3"} Dec 03 17:12:39 crc kubenswrapper[4841]: I1203 17:12:39.316160 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:12:39 crc kubenswrapper[4841]: I1203 17:12:39.316391 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:12:39 crc kubenswrapper[4841]: I1203 17:12:39.780602 4841 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 17:12:41 crc kubenswrapper[4841]: I1203 17:12:41.089358 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m" 
event={"ID":"cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0","Type":"ContainerStarted","Data":"12192ecfc6490f7a48bb36e28e345a1f7afd0fbbc993aa5d34617e393025c8d9"} Dec 03 17:12:41 crc kubenswrapper[4841]: I1203 17:12:41.114234 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-qw72m" podStartSLOduration=1.621235845 podStartE2EDuration="4.114214883s" podCreationTimestamp="2025-12-03 17:12:37 +0000 UTC" firstStartedPulling="2025-12-03 17:12:38.151681594 +0000 UTC m=+752.539202341" lastFinishedPulling="2025-12-03 17:12:40.644660652 +0000 UTC m=+755.032181379" observedRunningTime="2025-12-03 17:12:41.111202577 +0000 UTC m=+755.498723324" watchObservedRunningTime="2025-12-03 17:12:41.114214883 +0000 UTC m=+755.501735630" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.057383 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.058335 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.060757 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bb6wx" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.072155 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.076474 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.077096 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.078562 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.093668 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cm6t\" (UniqueName: \"kubernetes.io/projected/61ca4cad-30b3-4672-ae6c-59fd14e78a4a-kube-api-access-6cm6t\") pod \"nmstate-metrics-7f946cbc9-fq2q6\" (UID: \"61ca4cad-30b3-4672-ae6c-59fd14e78a4a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.107037 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.120492 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-mk7jv"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.121322 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.189984 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.190590 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.195231 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.198464 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.199087 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-wkftt" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.199557 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.199612 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7d600af9-9363-42fd-9b6c-dcf7181dc09b-dbus-socket\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.199641 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cm6t\" (UniqueName: \"kubernetes.io/projected/61ca4cad-30b3-4672-ae6c-59fd14e78a4a-kube-api-access-6cm6t\") pod \"nmstate-metrics-7f946cbc9-fq2q6\" (UID: \"61ca4cad-30b3-4672-ae6c-59fd14e78a4a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.199687 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7d600af9-9363-42fd-9b6c-dcf7181dc09b-nmstate-lock\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" 
Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.199886 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddq4r\" (UniqueName: \"kubernetes.io/projected/0b352e6a-f766-4261-87a1-5e71b591df3b-kube-api-access-ddq4r\") pod \"nmstate-webhook-5f6d4c5ccb-85htl\" (UID: \"0b352e6a-f766-4261-87a1-5e71b591df3b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.199915 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b352e6a-f766-4261-87a1-5e71b591df3b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-85htl\" (UID: \"0b352e6a-f766-4261-87a1-5e71b591df3b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.199940 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7d600af9-9363-42fd-9b6c-dcf7181dc09b-ovs-socket\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.199953 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzjc9\" (UniqueName: \"kubernetes.io/projected/7d600af9-9363-42fd-9b6c-dcf7181dc09b-kube-api-access-qzjc9\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.239395 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cm6t\" (UniqueName: \"kubernetes.io/projected/61ca4cad-30b3-4672-ae6c-59fd14e78a4a-kube-api-access-6cm6t\") pod \"nmstate-metrics-7f946cbc9-fq2q6\" (UID: 
\"61ca4cad-30b3-4672-ae6c-59fd14e78a4a\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.301082 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5bdd62f2-102a-4f3a-80aa-e3600df311a9-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-d4ssd\" (UID: \"5bdd62f2-102a-4f3a-80aa-e3600df311a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.301133 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7d600af9-9363-42fd-9b6c-dcf7181dc09b-dbus-socket\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.301163 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz5b6\" (UniqueName: \"kubernetes.io/projected/5bdd62f2-102a-4f3a-80aa-e3600df311a9-kube-api-access-lz5b6\") pod \"nmstate-console-plugin-7fbb5f6569-d4ssd\" (UID: \"5bdd62f2-102a-4f3a-80aa-e3600df311a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.301182 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7d600af9-9363-42fd-9b6c-dcf7181dc09b-nmstate-lock\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.301198 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddq4r\" (UniqueName: 
\"kubernetes.io/projected/0b352e6a-f766-4261-87a1-5e71b591df3b-kube-api-access-ddq4r\") pod \"nmstate-webhook-5f6d4c5ccb-85htl\" (UID: \"0b352e6a-f766-4261-87a1-5e71b591df3b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.301213 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd62f2-102a-4f3a-80aa-e3600df311a9-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-d4ssd\" (UID: \"5bdd62f2-102a-4f3a-80aa-e3600df311a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.301229 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b352e6a-f766-4261-87a1-5e71b591df3b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-85htl\" (UID: \"0b352e6a-f766-4261-87a1-5e71b591df3b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.301251 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7d600af9-9363-42fd-9b6c-dcf7181dc09b-ovs-socket\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.301266 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzjc9\" (UniqueName: \"kubernetes.io/projected/7d600af9-9363-42fd-9b6c-dcf7181dc09b-kube-api-access-qzjc9\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.302457 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7d600af9-9363-42fd-9b6c-dcf7181dc09b-dbus-socket\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.302529 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7d600af9-9363-42fd-9b6c-dcf7181dc09b-nmstate-lock\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.302616 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7d600af9-9363-42fd-9b6c-dcf7181dc09b-ovs-socket\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.306563 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/0b352e6a-f766-4261-87a1-5e71b591df3b-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-85htl\" (UID: \"0b352e6a-f766-4261-87a1-5e71b591df3b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.316984 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzjc9\" (UniqueName: \"kubernetes.io/projected/7d600af9-9363-42fd-9b6c-dcf7181dc09b-kube-api-access-qzjc9\") pod \"nmstate-handler-mk7jv\" (UID: \"7d600af9-9363-42fd-9b6c-dcf7181dc09b\") " pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.323587 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddq4r\" (UniqueName: 
\"kubernetes.io/projected/0b352e6a-f766-4261-87a1-5e71b591df3b-kube-api-access-ddq4r\") pod \"nmstate-webhook-5f6d4c5ccb-85htl\" (UID: \"0b352e6a-f766-4261-87a1-5e71b591df3b\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.373452 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.402047 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5bdd62f2-102a-4f3a-80aa-e3600df311a9-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-d4ssd\" (UID: \"5bdd62f2-102a-4f3a-80aa-e3600df311a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.402106 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz5b6\" (UniqueName: \"kubernetes.io/projected/5bdd62f2-102a-4f3a-80aa-e3600df311a9-kube-api-access-lz5b6\") pod \"nmstate-console-plugin-7fbb5f6569-d4ssd\" (UID: \"5bdd62f2-102a-4f3a-80aa-e3600df311a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.402139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd62f2-102a-4f3a-80aa-e3600df311a9-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-d4ssd\" (UID: \"5bdd62f2-102a-4f3a-80aa-e3600df311a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.402968 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5bdd62f2-102a-4f3a-80aa-e3600df311a9-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-d4ssd\" (UID: 
\"5bdd62f2-102a-4f3a-80aa-e3600df311a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.406121 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.406990 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/5bdd62f2-102a-4f3a-80aa-e3600df311a9-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-d4ssd\" (UID: \"5bdd62f2-102a-4f3a-80aa-e3600df311a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.407213 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64568db54c-pdf8j"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.407788 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.426810 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64568db54c-pdf8j"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.427373 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz5b6\" (UniqueName: \"kubernetes.io/projected/5bdd62f2-102a-4f3a-80aa-e3600df311a9-kube-api-access-lz5b6\") pod \"nmstate-console-plugin-7fbb5f6569-d4ssd\" (UID: \"5bdd62f2-102a-4f3a-80aa-e3600df311a9\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.438266 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.508770 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb6mk\" (UniqueName: \"kubernetes.io/projected/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-kube-api-access-cb6mk\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.509206 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-console-oauth-config\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.509236 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-trusted-ca-bundle\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.509268 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-console-config\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.509291 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-service-ca\") 
pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.509316 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-console-serving-cert\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.509417 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-oauth-serving-cert\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.513238 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.588175 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6"] Dec 03 17:12:42 crc kubenswrapper[4841]: W1203 17:12:42.594412 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61ca4cad_30b3_4672_ae6c_59fd14e78a4a.slice/crio-b78cff8da070bd1551429e58c3c7c63a68330e2153705968c74a14312b5c5d2b WatchSource:0}: Error finding container b78cff8da070bd1551429e58c3c7c63a68330e2153705968c74a14312b5c5d2b: Status 404 returned error can't find the container with id b78cff8da070bd1551429e58c3c7c63a68330e2153705968c74a14312b5c5d2b Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.611889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-console-oauth-config\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.611967 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-trusted-ca-bundle\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.612016 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-console-config\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 
17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.612038 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-service-ca\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.612061 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-console-serving-cert\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.612091 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-oauth-serving-cert\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.612136 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb6mk\" (UniqueName: \"kubernetes.io/projected/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-kube-api-access-cb6mk\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.614217 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-service-ca\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 
17:12:42.619575 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-console-oauth-config\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.620896 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-console-config\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.622383 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-trusted-ca-bundle\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.622698 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-console-serving-cert\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.622705 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-oauth-serving-cert\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.628787 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-cb6mk\" (UniqueName: \"kubernetes.io/projected/6c850bd9-bc13-4272-8cda-3c8dbd45de4d-kube-api-access-cb6mk\") pod \"console-64568db54c-pdf8j\" (UID: \"6c850bd9-bc13-4272-8cda-3c8dbd45de4d\") " pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.631295 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl"] Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.694216 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd"] Dec 03 17:12:42 crc kubenswrapper[4841]: W1203 17:12:42.698052 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bdd62f2_102a_4f3a_80aa_e3600df311a9.slice/crio-ded9761da91afa18e7d98cae95f4c58332c3d0688913ad101d2fb99a44d2e907 WatchSource:0}: Error finding container ded9761da91afa18e7d98cae95f4c58332c3d0688913ad101d2fb99a44d2e907: Status 404 returned error can't find the container with id ded9761da91afa18e7d98cae95f4c58332c3d0688913ad101d2fb99a44d2e907 Dec 03 17:12:42 crc kubenswrapper[4841]: I1203 17:12:42.723099 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:43 crc kubenswrapper[4841]: I1203 17:12:43.100335 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" event={"ID":"0b352e6a-f766-4261-87a1-5e71b591df3b","Type":"ContainerStarted","Data":"ade3f3a2fa174db311892db83adef61fa783b178e39f289ec6a01cab1c6157f4"} Dec 03 17:12:43 crc kubenswrapper[4841]: I1203 17:12:43.101721 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6" event={"ID":"61ca4cad-30b3-4672-ae6c-59fd14e78a4a","Type":"ContainerStarted","Data":"b78cff8da070bd1551429e58c3c7c63a68330e2153705968c74a14312b5c5d2b"} Dec 03 17:12:43 crc kubenswrapper[4841]: I1203 17:12:43.102718 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mk7jv" event={"ID":"7d600af9-9363-42fd-9b6c-dcf7181dc09b","Type":"ContainerStarted","Data":"d9e31ddd6b798adcc131cbb7e999ce1876f1889ee7bcfb69faad99f0a93decf3"} Dec 03 17:12:43 crc kubenswrapper[4841]: I1203 17:12:43.104304 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" event={"ID":"5bdd62f2-102a-4f3a-80aa-e3600df311a9","Type":"ContainerStarted","Data":"ded9761da91afa18e7d98cae95f4c58332c3d0688913ad101d2fb99a44d2e907"} Dec 03 17:12:43 crc kubenswrapper[4841]: I1203 17:12:43.150217 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64568db54c-pdf8j"] Dec 03 17:12:43 crc kubenswrapper[4841]: W1203 17:12:43.154213 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c850bd9_bc13_4272_8cda_3c8dbd45de4d.slice/crio-84c3fbceb6df350f392c1728734c55ad5b2fa0c415f0d53e80f22fca592ad1b1 WatchSource:0}: Error finding container 84c3fbceb6df350f392c1728734c55ad5b2fa0c415f0d53e80f22fca592ad1b1: Status 404 returned error can't find 
the container with id 84c3fbceb6df350f392c1728734c55ad5b2fa0c415f0d53e80f22fca592ad1b1 Dec 03 17:12:44 crc kubenswrapper[4841]: I1203 17:12:44.111553 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64568db54c-pdf8j" event={"ID":"6c850bd9-bc13-4272-8cda-3c8dbd45de4d","Type":"ContainerStarted","Data":"bc448dcfe41285f64ff6292ea68812bc1232432bbb9f7ef77581b9525b2f80d7"} Dec 03 17:12:44 crc kubenswrapper[4841]: I1203 17:12:44.111825 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64568db54c-pdf8j" event={"ID":"6c850bd9-bc13-4272-8cda-3c8dbd45de4d","Type":"ContainerStarted","Data":"84c3fbceb6df350f392c1728734c55ad5b2fa0c415f0d53e80f22fca592ad1b1"} Dec 03 17:12:44 crc kubenswrapper[4841]: I1203 17:12:44.131469 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64568db54c-pdf8j" podStartSLOduration=2.131448537 podStartE2EDuration="2.131448537s" podCreationTimestamp="2025-12-03 17:12:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:12:44.125495915 +0000 UTC m=+758.513016642" watchObservedRunningTime="2025-12-03 17:12:44.131448537 +0000 UTC m=+758.518969264" Dec 03 17:12:46 crc kubenswrapper[4841]: I1203 17:12:46.123337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-mk7jv" event={"ID":"7d600af9-9363-42fd-9b6c-dcf7181dc09b","Type":"ContainerStarted","Data":"b002e77309955cb5ee796fbaedbda8922770ca2da7aaee8240c19f75b24bfd7b"} Dec 03 17:12:46 crc kubenswrapper[4841]: I1203 17:12:46.124091 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:46 crc kubenswrapper[4841]: I1203 17:12:46.125645 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" 
event={"ID":"0b352e6a-f766-4261-87a1-5e71b591df3b","Type":"ContainerStarted","Data":"7981c5bd433437025104673910c508f9d9ab9de699f656fcbc2089cf0696df4d"} Dec 03 17:12:46 crc kubenswrapper[4841]: I1203 17:12:46.125804 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:12:46 crc kubenswrapper[4841]: I1203 17:12:46.128696 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6" event={"ID":"61ca4cad-30b3-4672-ae6c-59fd14e78a4a","Type":"ContainerStarted","Data":"8b217d6279f422ddc273948663476e53e135f2fa45ea1ed97c8e1ddbe7075e7f"} Dec 03 17:12:46 crc kubenswrapper[4841]: I1203 17:12:46.138891 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-mk7jv" podStartSLOduration=1.5931236709999999 podStartE2EDuration="4.138872185s" podCreationTimestamp="2025-12-03 17:12:42 +0000 UTC" firstStartedPulling="2025-12-03 17:12:42.462482307 +0000 UTC m=+756.850003034" lastFinishedPulling="2025-12-03 17:12:45.008230811 +0000 UTC m=+759.395751548" observedRunningTime="2025-12-03 17:12:46.137243554 +0000 UTC m=+760.524764281" watchObservedRunningTime="2025-12-03 17:12:46.138872185 +0000 UTC m=+760.526392922" Dec 03 17:12:46 crc kubenswrapper[4841]: I1203 17:12:46.154899 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" podStartSLOduration=1.8012651979999998 podStartE2EDuration="4.154873733s" podCreationTimestamp="2025-12-03 17:12:42 +0000 UTC" firstStartedPulling="2025-12-03 17:12:42.639237603 +0000 UTC m=+757.026758330" lastFinishedPulling="2025-12-03 17:12:44.992846118 +0000 UTC m=+759.380366865" observedRunningTime="2025-12-03 17:12:46.15081454 +0000 UTC m=+760.538335267" watchObservedRunningTime="2025-12-03 17:12:46.154873733 +0000 UTC m=+760.542394500" Dec 03 17:12:48 crc kubenswrapper[4841]: I1203 17:12:48.148029 
4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6" event={"ID":"61ca4cad-30b3-4672-ae6c-59fd14e78a4a","Type":"ContainerStarted","Data":"a1be5ed937127a107898ea52d8057ffc4788a2de8f86f3dab6f7e754febea752"} Dec 03 17:12:48 crc kubenswrapper[4841]: I1203 17:12:48.172472 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-fq2q6" podStartSLOduration=1.513702278 podStartE2EDuration="6.172447302s" podCreationTimestamp="2025-12-03 17:12:42 +0000 UTC" firstStartedPulling="2025-12-03 17:12:42.59675981 +0000 UTC m=+756.984280537" lastFinishedPulling="2025-12-03 17:12:47.255504834 +0000 UTC m=+761.643025561" observedRunningTime="2025-12-03 17:12:48.169462055 +0000 UTC m=+762.556982802" watchObservedRunningTime="2025-12-03 17:12:48.172447302 +0000 UTC m=+762.559968069" Dec 03 17:12:52 crc kubenswrapper[4841]: I1203 17:12:52.176338 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" event={"ID":"5bdd62f2-102a-4f3a-80aa-e3600df311a9","Type":"ContainerStarted","Data":"c9e68a210dce5a5bc4526dd6e4e96e4b4073968ebd90cf8d4bc85118bf021e68"} Dec 03 17:12:52 crc kubenswrapper[4841]: I1203 17:12:52.198263 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-d4ssd" podStartSLOduration=1.134006257 podStartE2EDuration="10.198237316s" podCreationTimestamp="2025-12-03 17:12:42 +0000 UTC" firstStartedPulling="2025-12-03 17:12:42.699837168 +0000 UTC m=+757.087357895" lastFinishedPulling="2025-12-03 17:12:51.764068197 +0000 UTC m=+766.151588954" observedRunningTime="2025-12-03 17:12:52.197581709 +0000 UTC m=+766.585102486" watchObservedRunningTime="2025-12-03 17:12:52.198237316 +0000 UTC m=+766.585758083" Dec 03 17:12:52 crc kubenswrapper[4841]: I1203 17:12:52.477685 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-mk7jv" Dec 03 17:12:52 crc kubenswrapper[4841]: I1203 17:12:52.723674 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:52 crc kubenswrapper[4841]: I1203 17:12:52.723769 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:52 crc kubenswrapper[4841]: I1203 17:12:52.734686 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:53 crc kubenswrapper[4841]: I1203 17:12:53.191625 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64568db54c-pdf8j" Dec 03 17:12:53 crc kubenswrapper[4841]: I1203 17:12:53.275020 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ngr75"] Dec 03 17:13:02 crc kubenswrapper[4841]: I1203 17:13:02.416760 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-85htl" Dec 03 17:13:09 crc kubenswrapper[4841]: I1203 17:13:09.316684 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:13:09 crc kubenswrapper[4841]: I1203 17:13:09.317318 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.632124 4841 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh"] Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.634496 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.636816 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.643800 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh"] Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.688528 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfhtp\" (UniqueName: \"kubernetes.io/projected/88a07844-4e33-407d-887f-abc37124f7e4-kube-api-access-jfhtp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.688606 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.688656 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-bundle\") pod 
\"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.789452 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.789511 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfhtp\" (UniqueName: \"kubernetes.io/projected/88a07844-4e33-407d-887f-abc37124f7e4-kube-api-access-jfhtp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.789581 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.790155 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " 
pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.790448 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.809267 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfhtp\" (UniqueName: \"kubernetes.io/projected/88a07844-4e33-407d-887f-abc37124f7e4-kube-api-access-jfhtp\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:14 crc kubenswrapper[4841]: I1203 17:13:14.953841 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:15 crc kubenswrapper[4841]: I1203 17:13:15.204167 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh"] Dec 03 17:13:15 crc kubenswrapper[4841]: W1203 17:13:15.217701 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88a07844_4e33_407d_887f_abc37124f7e4.slice/crio-e3856bb27f37af715cbf43ce63224a9f90960343628d0557d8cbfe3a739792fa WatchSource:0}: Error finding container e3856bb27f37af715cbf43ce63224a9f90960343628d0557d8cbfe3a739792fa: Status 404 returned error can't find the container with id e3856bb27f37af715cbf43ce63224a9f90960343628d0557d8cbfe3a739792fa Dec 03 17:13:15 crc kubenswrapper[4841]: I1203 17:13:15.341438 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" event={"ID":"88a07844-4e33-407d-887f-abc37124f7e4","Type":"ContainerStarted","Data":"e3856bb27f37af715cbf43ce63224a9f90960343628d0557d8cbfe3a739792fa"} Dec 03 17:13:16 crc kubenswrapper[4841]: I1203 17:13:16.349579 4841 generic.go:334] "Generic (PLEG): container finished" podID="88a07844-4e33-407d-887f-abc37124f7e4" containerID="aff6ff1c52599deabf620ebe1db4b0295ae0df8c610b8900d7234565a9e3e555" exitCode=0 Dec 03 17:13:16 crc kubenswrapper[4841]: I1203 17:13:16.349646 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" event={"ID":"88a07844-4e33-407d-887f-abc37124f7e4","Type":"ContainerDied","Data":"aff6ff1c52599deabf620ebe1db4b0295ae0df8c610b8900d7234565a9e3e555"} Dec 03 17:13:16 crc kubenswrapper[4841]: I1203 17:13:16.961634 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-56dsv"] Dec 03 17:13:16 crc kubenswrapper[4841]: I1203 17:13:16.963844 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:16 crc kubenswrapper[4841]: I1203 17:13:16.978364 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56dsv"] Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.020138 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpw75\" (UniqueName: \"kubernetes.io/projected/711ad95c-70ec-43fe-abd8-b8372028e6aa-kube-api-access-bpw75\") pod \"redhat-operators-56dsv\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.020501 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-catalog-content\") pod \"redhat-operators-56dsv\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.020538 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-utilities\") pod \"redhat-operators-56dsv\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.121978 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpw75\" (UniqueName: \"kubernetes.io/projected/711ad95c-70ec-43fe-abd8-b8372028e6aa-kube-api-access-bpw75\") pod \"redhat-operators-56dsv\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " 
pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.122054 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-catalog-content\") pod \"redhat-operators-56dsv\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.122076 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-utilities\") pod \"redhat-operators-56dsv\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.122549 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-utilities\") pod \"redhat-operators-56dsv\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.122782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-catalog-content\") pod \"redhat-operators-56dsv\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.143120 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpw75\" (UniqueName: \"kubernetes.io/projected/711ad95c-70ec-43fe-abd8-b8372028e6aa-kube-api-access-bpw75\") pod \"redhat-operators-56dsv\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc 
kubenswrapper[4841]: I1203 17:13:17.299643 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:17 crc kubenswrapper[4841]: I1203 17:13:17.704440 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-56dsv"] Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.335074 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-ngr75" podUID="34e90356-ed2e-4e60-9e00-97a1b62d640b" containerName="console" containerID="cri-o://64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c" gracePeriod=15 Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.366689 4841 generic.go:334] "Generic (PLEG): container finished" podID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerID="e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639" exitCode=0 Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.366803 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dsv" event={"ID":"711ad95c-70ec-43fe-abd8-b8372028e6aa","Type":"ContainerDied","Data":"e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639"} Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.366923 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dsv" event={"ID":"711ad95c-70ec-43fe-abd8-b8372028e6aa","Type":"ContainerStarted","Data":"fac40d0b3a5ca70b1def00fdd03cea2a805e0f1136316f26ad0cc2e4b72f89c0"} Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.767080 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ngr75_34e90356-ed2e-4e60-9e00-97a1b62d640b/console/0.log" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.767521 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.845718 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8qxn\" (UniqueName: \"kubernetes.io/projected/34e90356-ed2e-4e60-9e00-97a1b62d640b-kube-api-access-t8qxn\") pod \"34e90356-ed2e-4e60-9e00-97a1b62d640b\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.845883 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-serving-cert\") pod \"34e90356-ed2e-4e60-9e00-97a1b62d640b\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.845936 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-config\") pod \"34e90356-ed2e-4e60-9e00-97a1b62d640b\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.846005 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-trusted-ca-bundle\") pod \"34e90356-ed2e-4e60-9e00-97a1b62d640b\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.846704 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-config" (OuterVolumeSpecName: "console-config") pod "34e90356-ed2e-4e60-9e00-97a1b62d640b" (UID: "34e90356-ed2e-4e60-9e00-97a1b62d640b"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.847004 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-oauth-config\") pod \"34e90356-ed2e-4e60-9e00-97a1b62d640b\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.847152 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "34e90356-ed2e-4e60-9e00-97a1b62d640b" (UID: "34e90356-ed2e-4e60-9e00-97a1b62d640b"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.847459 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-oauth-serving-cert\") pod \"34e90356-ed2e-4e60-9e00-97a1b62d640b\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.847507 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-service-ca\") pod \"34e90356-ed2e-4e60-9e00-97a1b62d640b\" (UID: \"34e90356-ed2e-4e60-9e00-97a1b62d640b\") " Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.847784 4841 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.847803 4841 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.847930 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "34e90356-ed2e-4e60-9e00-97a1b62d640b" (UID: "34e90356-ed2e-4e60-9e00-97a1b62d640b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.848258 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-service-ca" (OuterVolumeSpecName: "service-ca") pod "34e90356-ed2e-4e60-9e00-97a1b62d640b" (UID: "34e90356-ed2e-4e60-9e00-97a1b62d640b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.854446 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "34e90356-ed2e-4e60-9e00-97a1b62d640b" (UID: "34e90356-ed2e-4e60-9e00-97a1b62d640b"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.855072 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "34e90356-ed2e-4e60-9e00-97a1b62d640b" (UID: "34e90356-ed2e-4e60-9e00-97a1b62d640b"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.855500 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e90356-ed2e-4e60-9e00-97a1b62d640b-kube-api-access-t8qxn" (OuterVolumeSpecName: "kube-api-access-t8qxn") pod "34e90356-ed2e-4e60-9e00-97a1b62d640b" (UID: "34e90356-ed2e-4e60-9e00-97a1b62d640b"). InnerVolumeSpecName "kube-api-access-t8qxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.949541 4841 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.949624 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8qxn\" (UniqueName: \"kubernetes.io/projected/34e90356-ed2e-4e60-9e00-97a1b62d640b-kube-api-access-t8qxn\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.949641 4841 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.949652 4841 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/34e90356-ed2e-4e60-9e00-97a1b62d640b-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:18 crc kubenswrapper[4841]: I1203 17:13:18.949661 4841 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/34e90356-ed2e-4e60-9e00-97a1b62d640b-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.376096 4841 generic.go:334] "Generic (PLEG): container 
finished" podID="88a07844-4e33-407d-887f-abc37124f7e4" containerID="2fd970a7dad3ef8390b1dbe781a5b6b7f588817a57c71f8336b4895c972d8920" exitCode=0 Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.376198 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" event={"ID":"88a07844-4e33-407d-887f-abc37124f7e4","Type":"ContainerDied","Data":"2fd970a7dad3ef8390b1dbe781a5b6b7f588817a57c71f8336b4895c972d8920"} Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.380140 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-ngr75_34e90356-ed2e-4e60-9e00-97a1b62d640b/console/0.log" Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.380199 4841 generic.go:334] "Generic (PLEG): container finished" podID="34e90356-ed2e-4e60-9e00-97a1b62d640b" containerID="64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c" exitCode=2 Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.380244 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ngr75" event={"ID":"34e90356-ed2e-4e60-9e00-97a1b62d640b","Type":"ContainerDied","Data":"64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c"} Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.380283 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ngr75" event={"ID":"34e90356-ed2e-4e60-9e00-97a1b62d640b","Type":"ContainerDied","Data":"4bc73e1809e9fc2f467c7281ed0da4dae2020f1007687d06f45dadb6d25fd386"} Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.380301 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-ngr75" Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.380308 4841 scope.go:117] "RemoveContainer" containerID="64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c" Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.424921 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-ngr75"] Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.431451 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-ngr75"] Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.906234 4841 scope.go:117] "RemoveContainer" containerID="64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c" Dec 03 17:13:19 crc kubenswrapper[4841]: E1203 17:13:19.907385 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c\": container with ID starting with 64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c not found: ID does not exist" containerID="64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c" Dec 03 17:13:19 crc kubenswrapper[4841]: I1203 17:13:19.907453 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c"} err="failed to get container status \"64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c\": rpc error: code = NotFound desc = could not find container \"64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c\": container with ID starting with 64d8e136af0580e20bc30c1e456a4bf6d14f6fa51e88416cc1c9f971d2bf4e5c not found: ID does not exist" Dec 03 17:13:20 crc kubenswrapper[4841]: I1203 17:13:20.253367 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34e90356-ed2e-4e60-9e00-97a1b62d640b" 
path="/var/lib/kubelet/pods/34e90356-ed2e-4e60-9e00-97a1b62d640b/volumes" Dec 03 17:13:20 crc kubenswrapper[4841]: I1203 17:13:20.388792 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dsv" event={"ID":"711ad95c-70ec-43fe-abd8-b8372028e6aa","Type":"ContainerStarted","Data":"da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a"} Dec 03 17:13:20 crc kubenswrapper[4841]: I1203 17:13:20.392238 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" event={"ID":"88a07844-4e33-407d-887f-abc37124f7e4","Type":"ContainerStarted","Data":"e41776388e72b7dc7fd5f9010cbe8691ea51b1e23c9e01fb5812fe581c917114"} Dec 03 17:13:20 crc kubenswrapper[4841]: I1203 17:13:20.447735 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" podStartSLOduration=4.15531138 podStartE2EDuration="6.44770161s" podCreationTimestamp="2025-12-03 17:13:14 +0000 UTC" firstStartedPulling="2025-12-03 17:13:16.351355854 +0000 UTC m=+790.738876611" lastFinishedPulling="2025-12-03 17:13:18.643746114 +0000 UTC m=+793.031266841" observedRunningTime="2025-12-03 17:13:20.436542037 +0000 UTC m=+794.824062794" watchObservedRunningTime="2025-12-03 17:13:20.44770161 +0000 UTC m=+794.835222387" Dec 03 17:13:21 crc kubenswrapper[4841]: I1203 17:13:21.405560 4841 generic.go:334] "Generic (PLEG): container finished" podID="88a07844-4e33-407d-887f-abc37124f7e4" containerID="e41776388e72b7dc7fd5f9010cbe8691ea51b1e23c9e01fb5812fe581c917114" exitCode=0 Dec 03 17:13:21 crc kubenswrapper[4841]: I1203 17:13:21.405640 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" 
event={"ID":"88a07844-4e33-407d-887f-abc37124f7e4","Type":"ContainerDied","Data":"e41776388e72b7dc7fd5f9010cbe8691ea51b1e23c9e01fb5812fe581c917114"} Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.429954 4841 generic.go:334] "Generic (PLEG): container finished" podID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerID="da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a" exitCode=0 Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.429969 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dsv" event={"ID":"711ad95c-70ec-43fe-abd8-b8372028e6aa","Type":"ContainerDied","Data":"da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a"} Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.713283 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.805152 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-bundle\") pod \"88a07844-4e33-407d-887f-abc37124f7e4\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.805223 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-util\") pod \"88a07844-4e33-407d-887f-abc37124f7e4\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.805288 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfhtp\" (UniqueName: \"kubernetes.io/projected/88a07844-4e33-407d-887f-abc37124f7e4-kube-api-access-jfhtp\") pod \"88a07844-4e33-407d-887f-abc37124f7e4\" (UID: \"88a07844-4e33-407d-887f-abc37124f7e4\") " 
Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.807009 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-bundle" (OuterVolumeSpecName: "bundle") pod "88a07844-4e33-407d-887f-abc37124f7e4" (UID: "88a07844-4e33-407d-887f-abc37124f7e4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.818867 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a07844-4e33-407d-887f-abc37124f7e4-kube-api-access-jfhtp" (OuterVolumeSpecName: "kube-api-access-jfhtp") pod "88a07844-4e33-407d-887f-abc37124f7e4" (UID: "88a07844-4e33-407d-887f-abc37124f7e4"). InnerVolumeSpecName "kube-api-access-jfhtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.831536 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-util" (OuterVolumeSpecName: "util") pod "88a07844-4e33-407d-887f-abc37124f7e4" (UID: "88a07844-4e33-407d-887f-abc37124f7e4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.907170 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.907210 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88a07844-4e33-407d-887f-abc37124f7e4-util\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:22 crc kubenswrapper[4841]: I1203 17:13:22.907223 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfhtp\" (UniqueName: \"kubernetes.io/projected/88a07844-4e33-407d-887f-abc37124f7e4-kube-api-access-jfhtp\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:23 crc kubenswrapper[4841]: I1203 17:13:23.442629 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" event={"ID":"88a07844-4e33-407d-887f-abc37124f7e4","Type":"ContainerDied","Data":"e3856bb27f37af715cbf43ce63224a9f90960343628d0557d8cbfe3a739792fa"} Dec 03 17:13:23 crc kubenswrapper[4841]: I1203 17:13:23.442692 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3856bb27f37af715cbf43ce63224a9f90960343628d0557d8cbfe3a739792fa" Dec 03 17:13:23 crc kubenswrapper[4841]: I1203 17:13:23.442805 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh" Dec 03 17:13:23 crc kubenswrapper[4841]: I1203 17:13:23.447855 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dsv" event={"ID":"711ad95c-70ec-43fe-abd8-b8372028e6aa","Type":"ContainerStarted","Data":"95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95"} Dec 03 17:13:23 crc kubenswrapper[4841]: I1203 17:13:23.484099 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-56dsv" podStartSLOduration=2.8995610960000002 podStartE2EDuration="7.484069659s" podCreationTimestamp="2025-12-03 17:13:16 +0000 UTC" firstStartedPulling="2025-12-03 17:13:18.40477951 +0000 UTC m=+792.792300247" lastFinishedPulling="2025-12-03 17:13:22.989288083 +0000 UTC m=+797.376808810" observedRunningTime="2025-12-03 17:13:23.477080341 +0000 UTC m=+797.864601108" watchObservedRunningTime="2025-12-03 17:13:23.484069659 +0000 UTC m=+797.871590416" Dec 03 17:13:27 crc kubenswrapper[4841]: I1203 17:13:27.300723 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:27 crc kubenswrapper[4841]: I1203 17:13:27.301729 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:28 crc kubenswrapper[4841]: I1203 17:13:28.359438 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-56dsv" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerName="registry-server" probeResult="failure" output=< Dec 03 17:13:28 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Dec 03 17:13:28 crc kubenswrapper[4841]: > Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.817579 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm"] Dec 03 17:13:31 crc kubenswrapper[4841]: E1203 17:13:31.818023 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a07844-4e33-407d-887f-abc37124f7e4" containerName="extract" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.818038 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a07844-4e33-407d-887f-abc37124f7e4" containerName="extract" Dec 03 17:13:31 crc kubenswrapper[4841]: E1203 17:13:31.818048 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a07844-4e33-407d-887f-abc37124f7e4" containerName="util" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.818054 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a07844-4e33-407d-887f-abc37124f7e4" containerName="util" Dec 03 17:13:31 crc kubenswrapper[4841]: E1203 17:13:31.818067 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e90356-ed2e-4e60-9e00-97a1b62d640b" containerName="console" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.818075 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e90356-ed2e-4e60-9e00-97a1b62d640b" containerName="console" Dec 03 17:13:31 crc kubenswrapper[4841]: E1203 17:13:31.818087 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88a07844-4e33-407d-887f-abc37124f7e4" containerName="pull" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.818092 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="88a07844-4e33-407d-887f-abc37124f7e4" containerName="pull" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.818201 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="88a07844-4e33-407d-887f-abc37124f7e4" containerName="extract" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.818217 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e90356-ed2e-4e60-9e00-97a1b62d640b" containerName="console" Dec 03 17:13:31 crc 
kubenswrapper[4841]: I1203 17:13:31.818565 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.820380 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.820688 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-v2j5c" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.820943 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.821512 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.822019 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.845657 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1fbe0c23-9239-43a3-981a-87b5d6f3af82-webhook-cert\") pod \"metallb-operator-controller-manager-67f9cc98fc-kcfzm\" (UID: \"1fbe0c23-9239-43a3-981a-87b5d6f3af82\") " pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.846050 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1fbe0c23-9239-43a3-981a-87b5d6f3af82-apiservice-cert\") pod \"metallb-operator-controller-manager-67f9cc98fc-kcfzm\" (UID: \"1fbe0c23-9239-43a3-981a-87b5d6f3af82\") " 
pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.849360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sr8n\" (UniqueName: \"kubernetes.io/projected/1fbe0c23-9239-43a3-981a-87b5d6f3af82-kube-api-access-6sr8n\") pod \"metallb-operator-controller-manager-67f9cc98fc-kcfzm\" (UID: \"1fbe0c23-9239-43a3-981a-87b5d6f3af82\") " pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.851195 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm"] Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.950866 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1fbe0c23-9239-43a3-981a-87b5d6f3af82-apiservice-cert\") pod \"metallb-operator-controller-manager-67f9cc98fc-kcfzm\" (UID: \"1fbe0c23-9239-43a3-981a-87b5d6f3af82\") " pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.950943 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sr8n\" (UniqueName: \"kubernetes.io/projected/1fbe0c23-9239-43a3-981a-87b5d6f3af82-kube-api-access-6sr8n\") pod \"metallb-operator-controller-manager-67f9cc98fc-kcfzm\" (UID: \"1fbe0c23-9239-43a3-981a-87b5d6f3af82\") " pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.951006 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1fbe0c23-9239-43a3-981a-87b5d6f3af82-webhook-cert\") pod \"metallb-operator-controller-manager-67f9cc98fc-kcfzm\" (UID: \"1fbe0c23-9239-43a3-981a-87b5d6f3af82\") " 
pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.959832 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/1fbe0c23-9239-43a3-981a-87b5d6f3af82-webhook-cert\") pod \"metallb-operator-controller-manager-67f9cc98fc-kcfzm\" (UID: \"1fbe0c23-9239-43a3-981a-87b5d6f3af82\") " pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.965785 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/1fbe0c23-9239-43a3-981a-87b5d6f3af82-apiservice-cert\") pod \"metallb-operator-controller-manager-67f9cc98fc-kcfzm\" (UID: \"1fbe0c23-9239-43a3-981a-87b5d6f3af82\") " pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:31 crc kubenswrapper[4841]: I1203 17:13:31.971156 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sr8n\" (UniqueName: \"kubernetes.io/projected/1fbe0c23-9239-43a3-981a-87b5d6f3af82-kube-api-access-6sr8n\") pod \"metallb-operator-controller-manager-67f9cc98fc-kcfzm\" (UID: \"1fbe0c23-9239-43a3-981a-87b5d6f3af82\") " pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.050279 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2"] Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.050948 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.052712 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.053258 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-5ln56" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.053374 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.064061 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2"] Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.143435 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.152375 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57d5b20b-d392-41ab-8729-d877277201e0-webhook-cert\") pod \"metallb-operator-webhook-server-7cc87bb9cb-96fv2\" (UID: \"57d5b20b-d392-41ab-8729-d877277201e0\") " pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.152432 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57d5b20b-d392-41ab-8729-d877277201e0-apiservice-cert\") pod \"metallb-operator-webhook-server-7cc87bb9cb-96fv2\" (UID: \"57d5b20b-d392-41ab-8729-d877277201e0\") " pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.155180 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxphm\" (UniqueName: \"kubernetes.io/projected/57d5b20b-d392-41ab-8729-d877277201e0-kube-api-access-jxphm\") pod \"metallb-operator-webhook-server-7cc87bb9cb-96fv2\" (UID: \"57d5b20b-d392-41ab-8729-d877277201e0\") " pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.269169 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57d5b20b-d392-41ab-8729-d877277201e0-webhook-cert\") pod \"metallb-operator-webhook-server-7cc87bb9cb-96fv2\" (UID: \"57d5b20b-d392-41ab-8729-d877277201e0\") " pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.269281 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57d5b20b-d392-41ab-8729-d877277201e0-apiservice-cert\") pod \"metallb-operator-webhook-server-7cc87bb9cb-96fv2\" (UID: \"57d5b20b-d392-41ab-8729-d877277201e0\") " pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.269360 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxphm\" (UniqueName: \"kubernetes.io/projected/57d5b20b-d392-41ab-8729-d877277201e0-kube-api-access-jxphm\") pod \"metallb-operator-webhook-server-7cc87bb9cb-96fv2\" (UID: \"57d5b20b-d392-41ab-8729-d877277201e0\") " pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.282874 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/57d5b20b-d392-41ab-8729-d877277201e0-apiservice-cert\") pod 
\"metallb-operator-webhook-server-7cc87bb9cb-96fv2\" (UID: \"57d5b20b-d392-41ab-8729-d877277201e0\") " pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.286720 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxphm\" (UniqueName: \"kubernetes.io/projected/57d5b20b-d392-41ab-8729-d877277201e0-kube-api-access-jxphm\") pod \"metallb-operator-webhook-server-7cc87bb9cb-96fv2\" (UID: \"57d5b20b-d392-41ab-8729-d877277201e0\") " pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.294513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/57d5b20b-d392-41ab-8729-d877277201e0-webhook-cert\") pod \"metallb-operator-webhook-server-7cc87bb9cb-96fv2\" (UID: \"57d5b20b-d392-41ab-8729-d877277201e0\") " pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.335767 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm"] Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.368013 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.506979 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" event={"ID":"1fbe0c23-9239-43a3-981a-87b5d6f3af82","Type":"ContainerStarted","Data":"d5e735466a1f1bce06986f28f68ed499c449ec4e91f91f37466ef0a922a168ed"} Dec 03 17:13:32 crc kubenswrapper[4841]: I1203 17:13:32.817875 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2"] Dec 03 17:13:32 crc kubenswrapper[4841]: W1203 17:13:32.828388 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57d5b20b_d392_41ab_8729_d877277201e0.slice/crio-91d41198515c4fecddce12e48caf0e36ccf45d6c122f630eda154352c4aa4bba WatchSource:0}: Error finding container 91d41198515c4fecddce12e48caf0e36ccf45d6c122f630eda154352c4aa4bba: Status 404 returned error can't find the container with id 91d41198515c4fecddce12e48caf0e36ccf45d6c122f630eda154352c4aa4bba Dec 03 17:13:33 crc kubenswrapper[4841]: I1203 17:13:33.511976 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" event={"ID":"57d5b20b-d392-41ab-8729-d877277201e0","Type":"ContainerStarted","Data":"91d41198515c4fecddce12e48caf0e36ccf45d6c122f630eda154352c4aa4bba"} Dec 03 17:13:37 crc kubenswrapper[4841]: I1203 17:13:37.403039 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:37 crc kubenswrapper[4841]: I1203 17:13:37.443264 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:37 crc kubenswrapper[4841]: I1203 17:13:37.543106 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" event={"ID":"1fbe0c23-9239-43a3-981a-87b5d6f3af82","Type":"ContainerStarted","Data":"f36cdc354f217851592f86c87e19bd7ad0a2a86187191611ae24780c27d5f8e7"} Dec 03 17:13:37 crc kubenswrapper[4841]: I1203 17:13:37.543408 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:13:37 crc kubenswrapper[4841]: I1203 17:13:37.546682 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" event={"ID":"57d5b20b-d392-41ab-8729-d877277201e0","Type":"ContainerStarted","Data":"e68e611bb258bb7223370b4b66bd3ef5b63002bf67ddb826751933919d34ae04"} Dec 03 17:13:37 crc kubenswrapper[4841]: I1203 17:13:37.546841 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:13:37 crc kubenswrapper[4841]: I1203 17:13:37.570499 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" podStartSLOduration=3.584456773 podStartE2EDuration="6.570480585s" podCreationTimestamp="2025-12-03 17:13:31 +0000 UTC" firstStartedPulling="2025-12-03 17:13:32.354872977 +0000 UTC m=+806.742393704" lastFinishedPulling="2025-12-03 17:13:35.340896789 +0000 UTC m=+809.728417516" observedRunningTime="2025-12-03 17:13:37.566612657 +0000 UTC m=+811.954133414" watchObservedRunningTime="2025-12-03 17:13:37.570480585 +0000 UTC m=+811.958001322" Dec 03 17:13:37 crc kubenswrapper[4841]: I1203 17:13:37.590408 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" podStartSLOduration=1.198510165 podStartE2EDuration="5.5903836s" podCreationTimestamp="2025-12-03 17:13:32 +0000 UTC" firstStartedPulling="2025-12-03 
17:13:32.83262537 +0000 UTC m=+807.220146097" lastFinishedPulling="2025-12-03 17:13:37.224498805 +0000 UTC m=+811.612019532" observedRunningTime="2025-12-03 17:13:37.584527891 +0000 UTC m=+811.972048658" watchObservedRunningTime="2025-12-03 17:13:37.5903836 +0000 UTC m=+811.977904367" Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.316473 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.316539 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.316593 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.317250 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1b26372173031353039ef7a8dc0bcb0ae6765d7cf65cf0a4fe3dfc913d879b03"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.317320 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" 
containerID="cri-o://1b26372173031353039ef7a8dc0bcb0ae6765d7cf65cf0a4fe3dfc913d879b03" gracePeriod=600 Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.547118 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56dsv"] Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.547724 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-56dsv" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerName="registry-server" containerID="cri-o://95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95" gracePeriod=2 Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.560093 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="1b26372173031353039ef7a8dc0bcb0ae6765d7cf65cf0a4fe3dfc913d879b03" exitCode=0 Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.560981 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"1b26372173031353039ef7a8dc0bcb0ae6765d7cf65cf0a4fe3dfc913d879b03"} Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.561047 4841 scope.go:117] "RemoveContainer" containerID="eb72285561a179b3790b876fa15c5bce1255f9d0d5aee06faebb894344a15403" Dec 03 17:13:39 crc kubenswrapper[4841]: I1203 17:13:39.968733 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.095458 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-utilities\") pod \"711ad95c-70ec-43fe-abd8-b8372028e6aa\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.095614 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-catalog-content\") pod \"711ad95c-70ec-43fe-abd8-b8372028e6aa\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.095692 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpw75\" (UniqueName: \"kubernetes.io/projected/711ad95c-70ec-43fe-abd8-b8372028e6aa-kube-api-access-bpw75\") pod \"711ad95c-70ec-43fe-abd8-b8372028e6aa\" (UID: \"711ad95c-70ec-43fe-abd8-b8372028e6aa\") " Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.096741 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-utilities" (OuterVolumeSpecName: "utilities") pod "711ad95c-70ec-43fe-abd8-b8372028e6aa" (UID: "711ad95c-70ec-43fe-abd8-b8372028e6aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.102201 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/711ad95c-70ec-43fe-abd8-b8372028e6aa-kube-api-access-bpw75" (OuterVolumeSpecName: "kube-api-access-bpw75") pod "711ad95c-70ec-43fe-abd8-b8372028e6aa" (UID: "711ad95c-70ec-43fe-abd8-b8372028e6aa"). InnerVolumeSpecName "kube-api-access-bpw75". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.197445 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpw75\" (UniqueName: \"kubernetes.io/projected/711ad95c-70ec-43fe-abd8-b8372028e6aa-kube-api-access-bpw75\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.197474 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.229600 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "711ad95c-70ec-43fe-abd8-b8372028e6aa" (UID: "711ad95c-70ec-43fe-abd8-b8372028e6aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.299153 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/711ad95c-70ec-43fe-abd8-b8372028e6aa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.567309 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"b3e3ec18aa928c5194a578236f76747e824d216c75a2a957951a1e3726f7b86a"} Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.569556 4841 generic.go:334] "Generic (PLEG): container finished" podID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerID="95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95" exitCode=0 Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.569596 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dsv" event={"ID":"711ad95c-70ec-43fe-abd8-b8372028e6aa","Type":"ContainerDied","Data":"95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95"} Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.569621 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-56dsv" event={"ID":"711ad95c-70ec-43fe-abd8-b8372028e6aa","Type":"ContainerDied","Data":"fac40d0b3a5ca70b1def00fdd03cea2a805e0f1136316f26ad0cc2e4b72f89c0"} Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.569639 4841 scope.go:117] "RemoveContainer" containerID="95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.569650 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-56dsv" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.586602 4841 scope.go:117] "RemoveContainer" containerID="da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.607321 4841 scope.go:117] "RemoveContainer" containerID="e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.613773 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-56dsv"] Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.616978 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-56dsv"] Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.633696 4841 scope.go:117] "RemoveContainer" containerID="95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95" Dec 03 17:13:40 crc kubenswrapper[4841]: E1203 17:13:40.634220 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95\": container with ID starting with 95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95 not found: ID does not exist" containerID="95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.634264 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95"} err="failed to get container status \"95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95\": rpc error: code = NotFound desc = could not find container \"95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95\": container with ID starting with 95bf6e3a016c00e4c4c71cf8bac389718d9723aa606fc6c70afe77be9be6cb95 not found: ID does not exist" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.634291 4841 scope.go:117] "RemoveContainer" containerID="da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a" Dec 03 17:13:40 crc kubenswrapper[4841]: E1203 17:13:40.634633 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a\": container with ID starting with da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a not found: ID does not exist" containerID="da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.634664 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a"} err="failed to get container status \"da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a\": rpc error: code = NotFound desc = could not find container \"da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a\": container with ID 
starting with da69b8f712ee2143999424e69908e798d30b150b562f528dbcd083baaa16808a not found: ID does not exist" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.634812 4841 scope.go:117] "RemoveContainer" containerID="e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639" Dec 03 17:13:40 crc kubenswrapper[4841]: E1203 17:13:40.635078 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639\": container with ID starting with e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639 not found: ID does not exist" containerID="e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639" Dec 03 17:13:40 crc kubenswrapper[4841]: I1203 17:13:40.635104 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639"} err="failed to get container status \"e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639\": rpc error: code = NotFound desc = could not find container \"e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639\": container with ID starting with e34e191625ea4f5c0b6d6b176606bf9c998bf993c15bb36e435472cf03765639 not found: ID does not exist" Dec 03 17:13:42 crc kubenswrapper[4841]: I1203 17:13:42.248128 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" path="/var/lib/kubelet/pods/711ad95c-70ec-43fe-abd8-b8372028e6aa/volumes" Dec 03 17:13:52 crc kubenswrapper[4841]: I1203 17:13:52.376080 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-7cc87bb9cb-96fv2" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.147351 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-controller-manager-67f9cc98fc-kcfzm" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.867207 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8"] Dec 03 17:14:12 crc kubenswrapper[4841]: E1203 17:14:12.867451 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerName="registry-server" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.867470 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerName="registry-server" Dec 03 17:14:12 crc kubenswrapper[4841]: E1203 17:14:12.867488 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerName="extract-content" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.867495 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerName="extract-content" Dec 03 17:14:12 crc kubenswrapper[4841]: E1203 17:14:12.867516 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerName="extract-utilities" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.867523 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerName="extract-utilities" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.867633 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="711ad95c-70ec-43fe-abd8-b8372028e6aa" containerName="registry-server" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.868113 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.871382 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-pdwhc" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.871978 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-r82gk"] Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.872744 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.877277 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.878986 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.879504 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.888852 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-metrics\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.888902 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-frr-conf\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.888937 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da26606f-a8a1-42e0-b156-9bd538f20c60-metrics-certs\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.888953 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/da26606f-a8a1-42e0-b156-9bd538f20c60-frr-startup\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.888977 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-frr-sockets\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.889003 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-reloader\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.889024 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb37434f-6f72-4e5c-85f5-5e06f1e07692-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-cncl8\" (UID: \"cb37434f-6f72-4e5c-85f5-5e06f1e07692\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.889044 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hbrq\" (UniqueName: 
\"kubernetes.io/projected/da26606f-a8a1-42e0-b156-9bd538f20c60-kube-api-access-7hbrq\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.889207 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j96tc\" (UniqueName: \"kubernetes.io/projected/cb37434f-6f72-4e5c-85f5-5e06f1e07692-kube-api-access-j96tc\") pod \"frr-k8s-webhook-server-7fcb986d4-cncl8\" (UID: \"cb37434f-6f72-4e5c-85f5-5e06f1e07692\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.894000 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8"] Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.951969 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vj7hl"] Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.952870 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vj7hl" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.955448 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.955600 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nprxf" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.955806 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.955828 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.966726 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-w4nm5"] Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.967758 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.973328 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.979500 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-w4nm5"] Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990175 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hbrq\" (UniqueName: \"kubernetes.io/projected/da26606f-a8a1-42e0-b156-9bd538f20c60-kube-api-access-7hbrq\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990216 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5228882-2889-44e6-8a36-db179d19fe25-metrics-certs\") pod \"controller-f8648f98b-w4nm5\" (UID: \"c5228882-2889-44e6-8a36-db179d19fe25\") " pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990243 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j96tc\" (UniqueName: \"kubernetes.io/projected/cb37434f-6f72-4e5c-85f5-5e06f1e07692-kube-api-access-j96tc\") pod \"frr-k8s-webhook-server-7fcb986d4-cncl8\" (UID: \"cb37434f-6f72-4e5c-85f5-5e06f1e07692\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990269 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5228882-2889-44e6-8a36-db179d19fe25-cert\") pod \"controller-f8648f98b-w4nm5\" (UID: \"c5228882-2889-44e6-8a36-db179d19fe25\") " 
pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990291 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-metrics\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990309 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-frr-conf\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990332 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-metrics-certs\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990352 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da26606f-a8a1-42e0-b156-9bd538f20c60-metrics-certs\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990373 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/da26606f-a8a1-42e0-b156-9bd538f20c60-frr-startup\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990407 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-frr-sockets\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990423 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zj7c\" (UniqueName: \"kubernetes.io/projected/aaf80919-384d-4751-9ca9-2b9f4994ef1b-kube-api-access-6zj7c\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990448 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-memberlist\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990471 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hb2z\" (UniqueName: \"kubernetes.io/projected/c5228882-2889-44e6-8a36-db179d19fe25-kube-api-access-2hb2z\") pod \"controller-f8648f98b-w4nm5\" (UID: \"c5228882-2889-44e6-8a36-db179d19fe25\") " pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990502 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-reloader\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990521 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: 
\"kubernetes.io/configmap/aaf80919-384d-4751-9ca9-2b9f4994ef1b-metallb-excludel2\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.990536 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb37434f-6f72-4e5c-85f5-5e06f1e07692-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-cncl8\" (UID: \"cb37434f-6f72-4e5c-85f5-5e06f1e07692\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:12 crc kubenswrapper[4841]: E1203 17:14:12.990706 4841 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 03 17:14:12 crc kubenswrapper[4841]: E1203 17:14:12.990778 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/da26606f-a8a1-42e0-b156-9bd538f20c60-metrics-certs podName:da26606f-a8a1-42e0-b156-9bd538f20c60 nodeName:}" failed. No retries permitted until 2025-12-03 17:14:13.490758554 +0000 UTC m=+847.878279281 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/da26606f-a8a1-42e0-b156-9bd538f20c60-metrics-certs") pod "frr-k8s-r82gk" (UID: "da26606f-a8a1-42e0-b156-9bd538f20c60") : secret "frr-k8s-certs-secret" not found Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.991143 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-frr-sockets\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.991372 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-reloader\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.991648 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-frr-conf\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.992065 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/da26606f-a8a1-42e0-b156-9bd538f20c60-metrics\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.992299 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/da26606f-a8a1-42e0-b156-9bd538f20c60-frr-startup\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 
17:14:12 crc kubenswrapper[4841]: I1203 17:14:12.999736 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cb37434f-6f72-4e5c-85f5-5e06f1e07692-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-cncl8\" (UID: \"cb37434f-6f72-4e5c-85f5-5e06f1e07692\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.007334 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j96tc\" (UniqueName: \"kubernetes.io/projected/cb37434f-6f72-4e5c-85f5-5e06f1e07692-kube-api-access-j96tc\") pod \"frr-k8s-webhook-server-7fcb986d4-cncl8\" (UID: \"cb37434f-6f72-4e5c-85f5-5e06f1e07692\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.014674 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hbrq\" (UniqueName: \"kubernetes.io/projected/da26606f-a8a1-42e0-b156-9bd538f20c60-kube-api-access-7hbrq\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.091343 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zj7c\" (UniqueName: \"kubernetes.io/projected/aaf80919-384d-4751-9ca9-2b9f4994ef1b-kube-api-access-6zj7c\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.091406 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-memberlist\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.091435 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2hb2z\" (UniqueName: \"kubernetes.io/projected/c5228882-2889-44e6-8a36-db179d19fe25-kube-api-access-2hb2z\") pod \"controller-f8648f98b-w4nm5\" (UID: \"c5228882-2889-44e6-8a36-db179d19fe25\") " pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.091476 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/aaf80919-384d-4751-9ca9-2b9f4994ef1b-metallb-excludel2\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.091516 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5228882-2889-44e6-8a36-db179d19fe25-metrics-certs\") pod \"controller-f8648f98b-w4nm5\" (UID: \"c5228882-2889-44e6-8a36-db179d19fe25\") " pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.091547 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5228882-2889-44e6-8a36-db179d19fe25-cert\") pod \"controller-f8648f98b-w4nm5\" (UID: \"c5228882-2889-44e6-8a36-db179d19fe25\") " pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.091579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-metrics-certs\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:13 crc kubenswrapper[4841]: E1203 17:14:13.091693 4841 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 03 17:14:13 crc 
kubenswrapper[4841]: E1203 17:14:13.091694 4841 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Dec 03 17:14:13 crc kubenswrapper[4841]: E1203 17:14:13.091938 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-metrics-certs podName:aaf80919-384d-4751-9ca9-2b9f4994ef1b nodeName:}" failed. No retries permitted until 2025-12-03 17:14:13.591922971 +0000 UTC m=+847.979443698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-metrics-certs") pod "speaker-vj7hl" (UID: "aaf80919-384d-4751-9ca9-2b9f4994ef1b") : secret "speaker-certs-secret" not found Dec 03 17:14:13 crc kubenswrapper[4841]: E1203 17:14:13.091957 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5228882-2889-44e6-8a36-db179d19fe25-metrics-certs podName:c5228882-2889-44e6-8a36-db179d19fe25 nodeName:}" failed. No retries permitted until 2025-12-03 17:14:13.591945982 +0000 UTC m=+847.979466699 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c5228882-2889-44e6-8a36-db179d19fe25-metrics-certs") pod "controller-f8648f98b-w4nm5" (UID: "c5228882-2889-44e6-8a36-db179d19fe25") : secret "controller-certs-secret" not found Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.092395 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/aaf80919-384d-4751-9ca9-2b9f4994ef1b-metallb-excludel2\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:13 crc kubenswrapper[4841]: E1203 17:14:13.092476 4841 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 17:14:13 crc kubenswrapper[4841]: E1203 17:14:13.092535 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-memberlist podName:aaf80919-384d-4751-9ca9-2b9f4994ef1b nodeName:}" failed. No retries permitted until 2025-12-03 17:14:13.592519646 +0000 UTC m=+847.980040373 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-memberlist") pod "speaker-vj7hl" (UID: "aaf80919-384d-4751-9ca9-2b9f4994ef1b") : secret "metallb-memberlist" not found Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.093722 4841 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.108372 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5228882-2889-44e6-8a36-db179d19fe25-cert\") pod \"controller-f8648f98b-w4nm5\" (UID: \"c5228882-2889-44e6-8a36-db179d19fe25\") " pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.110162 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hb2z\" (UniqueName: \"kubernetes.io/projected/c5228882-2889-44e6-8a36-db179d19fe25-kube-api-access-2hb2z\") pod \"controller-f8648f98b-w4nm5\" (UID: \"c5228882-2889-44e6-8a36-db179d19fe25\") " pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.111053 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zj7c\" (UniqueName: \"kubernetes.io/projected/aaf80919-384d-4751-9ca9-2b9f4994ef1b-kube-api-access-6zj7c\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.186260 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.440571 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8"] Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.496826 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da26606f-a8a1-42e0-b156-9bd538f20c60-metrics-certs\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.504194 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/da26606f-a8a1-42e0-b156-9bd538f20c60-metrics-certs\") pod \"frr-k8s-r82gk\" (UID: \"da26606f-a8a1-42e0-b156-9bd538f20c60\") " pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.598441 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-metrics-certs\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.598570 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-memberlist\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.598655 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5228882-2889-44e6-8a36-db179d19fe25-metrics-certs\") pod \"controller-f8648f98b-w4nm5\" (UID: 
\"c5228882-2889-44e6-8a36-db179d19fe25\") " pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:13 crc kubenswrapper[4841]: E1203 17:14:13.598797 4841 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 17:14:13 crc kubenswrapper[4841]: E1203 17:14:13.598998 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-memberlist podName:aaf80919-384d-4751-9ca9-2b9f4994ef1b nodeName:}" failed. No retries permitted until 2025-12-03 17:14:14.598958998 +0000 UTC m=+848.986479765 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-memberlist") pod "speaker-vj7hl" (UID: "aaf80919-384d-4751-9ca9-2b9f4994ef1b") : secret "metallb-memberlist" not found Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.602617 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-metrics-certs\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.605992 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c5228882-2889-44e6-8a36-db179d19fe25-metrics-certs\") pod \"controller-f8648f98b-w4nm5\" (UID: \"c5228882-2889-44e6-8a36-db179d19fe25\") " pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.801235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" event={"ID":"cb37434f-6f72-4e5c-85f5-5e06f1e07692","Type":"ContainerStarted","Data":"de9c50012bebe30781bcaffae07536966be9d9588a4f7a0fa7a03e92313ad191"} Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 
17:14:13.802941 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:13 crc kubenswrapper[4841]: I1203 17:14:13.883128 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.184932 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-w4nm5"] Dec 03 17:14:14 crc kubenswrapper[4841]: W1203 17:14:14.199014 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5228882_2889_44e6_8a36_db179d19fe25.slice/crio-bdb68c69409a142bd9ef52ede0fe5e44a3ac435015c8536427eb9f1590032553 WatchSource:0}: Error finding container bdb68c69409a142bd9ef52ede0fe5e44a3ac435015c8536427eb9f1590032553: Status 404 returned error can't find the container with id bdb68c69409a142bd9ef52ede0fe5e44a3ac435015c8536427eb9f1590032553 Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.617330 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-memberlist\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.625654 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/aaf80919-384d-4751-9ca9-2b9f4994ef1b-memberlist\") pod \"speaker-vj7hl\" (UID: \"aaf80919-384d-4751-9ca9-2b9f4994ef1b\") " pod="metallb-system/speaker-vj7hl" Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.765887 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-vj7hl" Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.827425 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-w4nm5" event={"ID":"c5228882-2889-44e6-8a36-db179d19fe25","Type":"ContainerStarted","Data":"24b3c19af5cbe89108880862ea81cedcaba69a9290beab8be64966e918f2f4cc"} Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.827480 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-w4nm5" event={"ID":"c5228882-2889-44e6-8a36-db179d19fe25","Type":"ContainerStarted","Data":"f9013b3842892b439030a1232c6bc5e2f8f374d5e11dce877587010f2aa9b605"} Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.827511 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-w4nm5" event={"ID":"c5228882-2889-44e6-8a36-db179d19fe25","Type":"ContainerStarted","Data":"bdb68c69409a142bd9ef52ede0fe5e44a3ac435015c8536427eb9f1590032553"} Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.828107 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.834205 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerStarted","Data":"f3de2682add8f98d4032187cf807aa9eb81232b119c86225a1a642f92b3bd003"} Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.838619 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vj7hl" event={"ID":"aaf80919-384d-4751-9ca9-2b9f4994ef1b","Type":"ContainerStarted","Data":"86d2322601cd0d5bb91b7d39484ba4a2b5870fe72ac7d50a64029095ce2f1c6e"} Dec 03 17:14:14 crc kubenswrapper[4841]: I1203 17:14:14.852310 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-w4nm5" 
podStartSLOduration=2.852292201 podStartE2EDuration="2.852292201s" podCreationTimestamp="2025-12-03 17:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:14:14.851159113 +0000 UTC m=+849.238679840" watchObservedRunningTime="2025-12-03 17:14:14.852292201 +0000 UTC m=+849.239812928" Dec 03 17:14:15 crc kubenswrapper[4841]: I1203 17:14:15.847427 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vj7hl" event={"ID":"aaf80919-384d-4751-9ca9-2b9f4994ef1b","Type":"ContainerStarted","Data":"12cf60be45f272036776db7b5f2c9f1511f2740d97aa3ecdb63c14ce3e301ae3"} Dec 03 17:14:15 crc kubenswrapper[4841]: I1203 17:14:15.847733 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vj7hl" event={"ID":"aaf80919-384d-4751-9ca9-2b9f4994ef1b","Type":"ContainerStarted","Data":"c36c5501da26783d217a310f7038b0156ed72cafe3f8ec0cc8d30593c3b6ed63"} Dec 03 17:14:15 crc kubenswrapper[4841]: I1203 17:14:15.866573 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vj7hl" podStartSLOduration=3.8665560279999998 podStartE2EDuration="3.866556028s" podCreationTimestamp="2025-12-03 17:14:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:14:15.862030093 +0000 UTC m=+850.249550830" watchObservedRunningTime="2025-12-03 17:14:15.866556028 +0000 UTC m=+850.254076745" Dec 03 17:14:16 crc kubenswrapper[4841]: I1203 17:14:16.859093 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vj7hl" Dec 03 17:14:22 crc kubenswrapper[4841]: I1203 17:14:22.904189 4841 generic.go:334] "Generic (PLEG): container finished" podID="da26606f-a8a1-42e0-b156-9bd538f20c60" containerID="4f36ba7480fb704f70192c4d7f60111b1c6b187634a7efc4772b0b036e4a7589" exitCode=0 Dec 03 
17:14:22 crc kubenswrapper[4841]: I1203 17:14:22.904251 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerDied","Data":"4f36ba7480fb704f70192c4d7f60111b1c6b187634a7efc4772b0b036e4a7589"} Dec 03 17:14:22 crc kubenswrapper[4841]: I1203 17:14:22.906474 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" event={"ID":"cb37434f-6f72-4e5c-85f5-5e06f1e07692","Type":"ContainerStarted","Data":"f507ec3dcae19e1b599b8d5d055084aaa5091986f1b3aafc850de0ef536701f2"} Dec 03 17:14:22 crc kubenswrapper[4841]: I1203 17:14:22.906672 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:22 crc kubenswrapper[4841]: I1203 17:14:22.967932 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" podStartSLOduration=2.45621277 podStartE2EDuration="10.967839015s" podCreationTimestamp="2025-12-03 17:14:12 +0000 UTC" firstStartedPulling="2025-12-03 17:14:13.453940428 +0000 UTC m=+847.841461165" lastFinishedPulling="2025-12-03 17:14:21.965566643 +0000 UTC m=+856.353087410" observedRunningTime="2025-12-03 17:14:22.961797262 +0000 UTC m=+857.349317999" watchObservedRunningTime="2025-12-03 17:14:22.967839015 +0000 UTC m=+857.355359742" Dec 03 17:14:23 crc kubenswrapper[4841]: I1203 17:14:23.966172 4841 generic.go:334] "Generic (PLEG): container finished" podID="da26606f-a8a1-42e0-b156-9bd538f20c60" containerID="c8a4ac9aa4a2c23baacb3cd6ee5a15a92f757bf4da1cfa439832945be83ab4aa" exitCode=0 Dec 03 17:14:23 crc kubenswrapper[4841]: I1203 17:14:23.967589 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" 
event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerDied","Data":"c8a4ac9aa4a2c23baacb3cd6ee5a15a92f757bf4da1cfa439832945be83ab4aa"} Dec 03 17:14:24 crc kubenswrapper[4841]: I1203 17:14:24.978997 4841 generic.go:334] "Generic (PLEG): container finished" podID="da26606f-a8a1-42e0-b156-9bd538f20c60" containerID="a21a38523dd036169515528c6e625c6dc00ad9c94f2bf2ad8a0b7244b7c216d2" exitCode=0 Dec 03 17:14:24 crc kubenswrapper[4841]: I1203 17:14:24.979073 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerDied","Data":"a21a38523dd036169515528c6e625c6dc00ad9c94f2bf2ad8a0b7244b7c216d2"} Dec 03 17:14:25 crc kubenswrapper[4841]: I1203 17:14:25.994377 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerStarted","Data":"015192487b637f80090ef52b54055ab523a88597e0af29e873b8a12050ffda74"} Dec 03 17:14:25 crc kubenswrapper[4841]: I1203 17:14:25.995130 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerStarted","Data":"7aba7eaf2cc526deebe9533608ed18a2a237fd6ac7fc6870974a201ba3646eaf"} Dec 03 17:14:25 crc kubenswrapper[4841]: I1203 17:14:25.995148 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerStarted","Data":"9788d2738ba07e54da6a09e8ae70f18bea2882b50e499d8972ff9122d79d916f"} Dec 03 17:14:27 crc kubenswrapper[4841]: I1203 17:14:27.007050 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerStarted","Data":"0e2a297b133b47621c82d18fd3dcd3c79df8a2dc33f098e3678ff8d576a46260"} Dec 03 17:14:27 crc kubenswrapper[4841]: I1203 17:14:27.007097 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerStarted","Data":"67c33ed696d19d714952b40a60830b1941016bea1b33424a2dd258f5da198cf2"} Dec 03 17:14:27 crc kubenswrapper[4841]: I1203 17:14:27.007111 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r82gk" event={"ID":"da26606f-a8a1-42e0-b156-9bd538f20c60","Type":"ContainerStarted","Data":"d56164a6fd7096ae43abc4ccb14fc60e0d3e2f689b63d6fe5ee8c9c7460af88b"} Dec 03 17:14:27 crc kubenswrapper[4841]: I1203 17:14:27.007456 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:27 crc kubenswrapper[4841]: I1203 17:14:27.041186 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-r82gk" podStartSLOduration=7.334385656 podStartE2EDuration="15.041161717s" podCreationTimestamp="2025-12-03 17:14:12 +0000 UTC" firstStartedPulling="2025-12-03 17:14:14.274731816 +0000 UTC m=+848.662252583" lastFinishedPulling="2025-12-03 17:14:21.981507887 +0000 UTC m=+856.369028644" observedRunningTime="2025-12-03 17:14:27.03421442 +0000 UTC m=+861.421735217" watchObservedRunningTime="2025-12-03 17:14:27.041161717 +0000 UTC m=+861.428682474" Dec 03 17:14:28 crc kubenswrapper[4841]: I1203 17:14:28.804444 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:28 crc kubenswrapper[4841]: I1203 17:14:28.842988 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:33 crc kubenswrapper[4841]: I1203 17:14:33.191862 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-cncl8" Dec 03 17:14:33 crc kubenswrapper[4841]: I1203 17:14:33.888801 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="metallb-system/controller-f8648f98b-w4nm5" Dec 03 17:14:34 crc kubenswrapper[4841]: I1203 17:14:34.771869 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vj7hl" Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.153101 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-ldgf7"] Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.154470 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ldgf7" Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.157403 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-zqb77" Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.157436 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.158019 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.177685 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ldgf7"] Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.245150 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6gtz\" (UniqueName: \"kubernetes.io/projected/dca21fb1-4a3f-4003-a1e2-46c1b191b911-kube-api-access-m6gtz\") pod \"openstack-operator-index-ldgf7\" (UID: \"dca21fb1-4a3f-4003-a1e2-46c1b191b911\") " pod="openstack-operators/openstack-operator-index-ldgf7" Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.346200 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m6gtz\" (UniqueName: 
\"kubernetes.io/projected/dca21fb1-4a3f-4003-a1e2-46c1b191b911-kube-api-access-m6gtz\") pod \"openstack-operator-index-ldgf7\" (UID: \"dca21fb1-4a3f-4003-a1e2-46c1b191b911\") " pod="openstack-operators/openstack-operator-index-ldgf7" Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.372059 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6gtz\" (UniqueName: \"kubernetes.io/projected/dca21fb1-4a3f-4003-a1e2-46c1b191b911-kube-api-access-m6gtz\") pod \"openstack-operator-index-ldgf7\" (UID: \"dca21fb1-4a3f-4003-a1e2-46c1b191b911\") " pod="openstack-operators/openstack-operator-index-ldgf7" Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.475322 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-ldgf7" Dec 03 17:14:41 crc kubenswrapper[4841]: I1203 17:14:41.731631 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-ldgf7"] Dec 03 17:14:42 crc kubenswrapper[4841]: I1203 17:14:42.114396 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ldgf7" event={"ID":"dca21fb1-4a3f-4003-a1e2-46c1b191b911","Type":"ContainerStarted","Data":"4e59c7937b528ee5415415de4900cfbca4cfadb64294796789c975df76da962e"} Dec 03 17:14:43 crc kubenswrapper[4841]: I1203 17:14:43.807793 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-r82gk" Dec 03 17:14:46 crc kubenswrapper[4841]: I1203 17:14:46.155383 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-ldgf7" event={"ID":"dca21fb1-4a3f-4003-a1e2-46c1b191b911","Type":"ContainerStarted","Data":"eb648b8360d9591b23e4704dc8c82cd658b0b65d6e505f49df9fd3dae179e9bb"} Dec 03 17:14:46 crc kubenswrapper[4841]: I1203 17:14:46.178887 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-index-ldgf7" podStartSLOduration=1.3136775 podStartE2EDuration="5.178861721s" podCreationTimestamp="2025-12-03 17:14:41 +0000 UTC" firstStartedPulling="2025-12-03 17:14:41.751098085 +0000 UTC m=+876.138618812" lastFinishedPulling="2025-12-03 17:14:45.616282306 +0000 UTC m=+880.003803033" observedRunningTime="2025-12-03 17:14:46.175519237 +0000 UTC m=+880.563040034" watchObservedRunningTime="2025-12-03 17:14:46.178861721 +0000 UTC m=+880.566382488" Dec 03 17:14:51 crc kubenswrapper[4841]: I1203 17:14:51.475957 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-ldgf7" Dec 03 17:14:51 crc kubenswrapper[4841]: I1203 17:14:51.476571 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-ldgf7" Dec 03 17:14:51 crc kubenswrapper[4841]: I1203 17:14:51.508572 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-ldgf7" Dec 03 17:14:52 crc kubenswrapper[4841]: I1203 17:14:52.399505 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-ldgf7" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.600661 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf"] Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.603352 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.609530 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-gjq46" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.622215 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf"] Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.700806 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-util\") pod \"89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.700860 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfzhp\" (UniqueName: \"kubernetes.io/projected/dd47aeca-76ee-41e9-8707-43067a97d9ff-kube-api-access-sfzhp\") pod \"89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.700991 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-bundle\") pod \"89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 
17:14:54.802002 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-util\") pod \"89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.802058 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfzhp\" (UniqueName: \"kubernetes.io/projected/dd47aeca-76ee-41e9-8707-43067a97d9ff-kube-api-access-sfzhp\") pod \"89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.802133 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-bundle\") pod \"89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.803118 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-util\") pod \"89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.803265 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-bundle\") pod \"89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.826776 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfzhp\" (UniqueName: \"kubernetes.io/projected/dd47aeca-76ee-41e9-8707-43067a97d9ff-kube-api-access-sfzhp\") pod \"89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:54 crc kubenswrapper[4841]: I1203 17:14:54.940051 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:55 crc kubenswrapper[4841]: I1203 17:14:55.447106 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf"] Dec 03 17:14:56 crc kubenswrapper[4841]: I1203 17:14:56.410854 4841 generic.go:334] "Generic (PLEG): container finished" podID="dd47aeca-76ee-41e9-8707-43067a97d9ff" containerID="309507bb452ca00284da3e2bda579714271eec54345eeddcc0d10081c370f588" exitCode=0 Dec 03 17:14:56 crc kubenswrapper[4841]: I1203 17:14:56.410938 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" event={"ID":"dd47aeca-76ee-41e9-8707-43067a97d9ff","Type":"ContainerDied","Data":"309507bb452ca00284da3e2bda579714271eec54345eeddcc0d10081c370f588"} Dec 03 17:14:56 crc kubenswrapper[4841]: I1203 17:14:56.411171 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" event={"ID":"dd47aeca-76ee-41e9-8707-43067a97d9ff","Type":"ContainerStarted","Data":"630019bed570f49cf53139f88b5ca35e1765123fa24c8f9c408c1fd6235aecf3"} Dec 03 17:14:57 crc kubenswrapper[4841]: I1203 17:14:57.419793 4841 generic.go:334] "Generic (PLEG): container finished" podID="dd47aeca-76ee-41e9-8707-43067a97d9ff" containerID="610315dea61e8d848fa2212cc1a5169dc7e9d1322db21b04524a2c44a1350365" exitCode=0 Dec 03 17:14:57 crc kubenswrapper[4841]: I1203 17:14:57.419928 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" event={"ID":"dd47aeca-76ee-41e9-8707-43067a97d9ff","Type":"ContainerDied","Data":"610315dea61e8d848fa2212cc1a5169dc7e9d1322db21b04524a2c44a1350365"} Dec 03 17:14:58 crc kubenswrapper[4841]: I1203 17:14:58.432765 4841 generic.go:334] "Generic (PLEG): container finished" podID="dd47aeca-76ee-41e9-8707-43067a97d9ff" containerID="0d6b62af71b2725f594b18c7c369101ef91ca2714518dd0f2be60e32f1fd1745" exitCode=0 Dec 03 17:14:58 crc kubenswrapper[4841]: I1203 17:14:58.432843 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" event={"ID":"dd47aeca-76ee-41e9-8707-43067a97d9ff","Type":"ContainerDied","Data":"0d6b62af71b2725f594b18c7c369101ef91ca2714518dd0f2be60e32f1fd1745"} Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.771529 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.875058 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-bundle\") pod \"dd47aeca-76ee-41e9-8707-43067a97d9ff\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.875145 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-util\") pod \"dd47aeca-76ee-41e9-8707-43067a97d9ff\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.875225 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfzhp\" (UniqueName: \"kubernetes.io/projected/dd47aeca-76ee-41e9-8707-43067a97d9ff-kube-api-access-sfzhp\") pod \"dd47aeca-76ee-41e9-8707-43067a97d9ff\" (UID: \"dd47aeca-76ee-41e9-8707-43067a97d9ff\") " Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.876284 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-bundle" (OuterVolumeSpecName: "bundle") pod "dd47aeca-76ee-41e9-8707-43067a97d9ff" (UID: "dd47aeca-76ee-41e9-8707-43067a97d9ff"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.883863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd47aeca-76ee-41e9-8707-43067a97d9ff-kube-api-access-sfzhp" (OuterVolumeSpecName: "kube-api-access-sfzhp") pod "dd47aeca-76ee-41e9-8707-43067a97d9ff" (UID: "dd47aeca-76ee-41e9-8707-43067a97d9ff"). InnerVolumeSpecName "kube-api-access-sfzhp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.887980 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-util" (OuterVolumeSpecName: "util") pod "dd47aeca-76ee-41e9-8707-43067a97d9ff" (UID: "dd47aeca-76ee-41e9-8707-43067a97d9ff"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.976488 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-util\") on node \"crc\" DevicePath \"\"" Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.976548 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfzhp\" (UniqueName: \"kubernetes.io/projected/dd47aeca-76ee-41e9-8707-43067a97d9ff-kube-api-access-sfzhp\") on node \"crc\" DevicePath \"\"" Dec 03 17:14:59 crc kubenswrapper[4841]: I1203 17:14:59.976569 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd47aeca-76ee-41e9-8707-43067a97d9ff-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.158467 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt"] Dec 03 17:15:00 crc kubenswrapper[4841]: E1203 17:15:00.158709 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd47aeca-76ee-41e9-8707-43067a97d9ff" containerName="extract" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.158721 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd47aeca-76ee-41e9-8707-43067a97d9ff" containerName="extract" Dec 03 17:15:00 crc kubenswrapper[4841]: E1203 17:15:00.158735 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd47aeca-76ee-41e9-8707-43067a97d9ff" 
containerName="util" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.158741 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd47aeca-76ee-41e9-8707-43067a97d9ff" containerName="util" Dec 03 17:15:00 crc kubenswrapper[4841]: E1203 17:15:00.158755 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd47aeca-76ee-41e9-8707-43067a97d9ff" containerName="pull" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.158761 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd47aeca-76ee-41e9-8707-43067a97d9ff" containerName="pull" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.158879 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd47aeca-76ee-41e9-8707-43067a97d9ff" containerName="extract" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.159331 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.161848 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.162597 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.179371 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt"] Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.279260 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/480e0b95-5535-4d4b-9261-8b0e5ea621ef-secret-volume\") pod \"collect-profiles-29413035-w5flt\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.279396 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/480e0b95-5535-4d4b-9261-8b0e5ea621ef-config-volume\") pod \"collect-profiles-29413035-w5flt\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.279446 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrs4p\" (UniqueName: \"kubernetes.io/projected/480e0b95-5535-4d4b-9261-8b0e5ea621ef-kube-api-access-hrs4p\") pod \"collect-profiles-29413035-w5flt\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.381217 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/480e0b95-5535-4d4b-9261-8b0e5ea621ef-secret-volume\") pod \"collect-profiles-29413035-w5flt\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.381273 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/480e0b95-5535-4d4b-9261-8b0e5ea621ef-config-volume\") pod \"collect-profiles-29413035-w5flt\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.381298 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrs4p\" (UniqueName: 
\"kubernetes.io/projected/480e0b95-5535-4d4b-9261-8b0e5ea621ef-kube-api-access-hrs4p\") pod \"collect-profiles-29413035-w5flt\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.383263 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/480e0b95-5535-4d4b-9261-8b0e5ea621ef-config-volume\") pod \"collect-profiles-29413035-w5flt\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.388997 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/480e0b95-5535-4d4b-9261-8b0e5ea621ef-secret-volume\") pod \"collect-profiles-29413035-w5flt\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.401829 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrs4p\" (UniqueName: \"kubernetes.io/projected/480e0b95-5535-4d4b-9261-8b0e5ea621ef-kube-api-access-hrs4p\") pod \"collect-profiles-29413035-w5flt\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.450415 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" event={"ID":"dd47aeca-76ee-41e9-8707-43067a97d9ff","Type":"ContainerDied","Data":"630019bed570f49cf53139f88b5ca35e1765123fa24c8f9c408c1fd6235aecf3"} Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.450459 4841 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="630019bed570f49cf53139f88b5ca35e1765123fa24c8f9c408c1fd6235aecf3" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.450552 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.477265 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:00 crc kubenswrapper[4841]: I1203 17:15:00.941240 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt"] Dec 03 17:15:01 crc kubenswrapper[4841]: I1203 17:15:01.468788 4841 generic.go:334] "Generic (PLEG): container finished" podID="480e0b95-5535-4d4b-9261-8b0e5ea621ef" containerID="19201c2b3a5aa5b4fec97f0b84eb404e483bfc55d6dbc47b93a926c59973a213" exitCode=0 Dec 03 17:15:01 crc kubenswrapper[4841]: I1203 17:15:01.469014 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" event={"ID":"480e0b95-5535-4d4b-9261-8b0e5ea621ef","Type":"ContainerDied","Data":"19201c2b3a5aa5b4fec97f0b84eb404e483bfc55d6dbc47b93a926c59973a213"} Dec 03 17:15:01 crc kubenswrapper[4841]: I1203 17:15:01.469138 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" event={"ID":"480e0b95-5535-4d4b-9261-8b0e5ea621ef","Type":"ContainerStarted","Data":"ce96d1fbb2d36e495de69269b92c432339b6c37da8d1f8f179b762e1446e820f"} Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.774896 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.816895 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/480e0b95-5535-4d4b-9261-8b0e5ea621ef-secret-volume\") pod \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.816987 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/480e0b95-5535-4d4b-9261-8b0e5ea621ef-config-volume\") pod \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.817025 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrs4p\" (UniqueName: \"kubernetes.io/projected/480e0b95-5535-4d4b-9261-8b0e5ea621ef-kube-api-access-hrs4p\") pod \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\" (UID: \"480e0b95-5535-4d4b-9261-8b0e5ea621ef\") " Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.820496 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/480e0b95-5535-4d4b-9261-8b0e5ea621ef-config-volume" (OuterVolumeSpecName: "config-volume") pod "480e0b95-5535-4d4b-9261-8b0e5ea621ef" (UID: "480e0b95-5535-4d4b-9261-8b0e5ea621ef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.840725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/480e0b95-5535-4d4b-9261-8b0e5ea621ef-kube-api-access-hrs4p" (OuterVolumeSpecName: "kube-api-access-hrs4p") pod "480e0b95-5535-4d4b-9261-8b0e5ea621ef" (UID: "480e0b95-5535-4d4b-9261-8b0e5ea621ef"). 
InnerVolumeSpecName "kube-api-access-hrs4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.851502 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/480e0b95-5535-4d4b-9261-8b0e5ea621ef-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "480e0b95-5535-4d4b-9261-8b0e5ea621ef" (UID: "480e0b95-5535-4d4b-9261-8b0e5ea621ef"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.919101 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/480e0b95-5535-4d4b-9261-8b0e5ea621ef-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.919161 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/480e0b95-5535-4d4b-9261-8b0e5ea621ef-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 17:15:02 crc kubenswrapper[4841]: I1203 17:15:02.919183 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrs4p\" (UniqueName: \"kubernetes.io/projected/480e0b95-5535-4d4b-9261-8b0e5ea621ef-kube-api-access-hrs4p\") on node \"crc\" DevicePath \"\"" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.457117 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc"] Dec 03 17:15:03 crc kubenswrapper[4841]: E1203 17:15:03.457593 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="480e0b95-5535-4d4b-9261-8b0e5ea621ef" containerName="collect-profiles" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.457605 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="480e0b95-5535-4d4b-9261-8b0e5ea621ef" containerName="collect-profiles" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 
17:15:03.457725 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="480e0b95-5535-4d4b-9261-8b0e5ea621ef" containerName="collect-profiles" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.458103 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.460418 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-gd6d8" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.480644 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" event={"ID":"480e0b95-5535-4d4b-9261-8b0e5ea621ef","Type":"ContainerDied","Data":"ce96d1fbb2d36e495de69269b92c432339b6c37da8d1f8f179b762e1446e820f"} Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.480679 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce96d1fbb2d36e495de69269b92c432339b6c37da8d1f8f179b762e1446e820f" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.480730 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.493462 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc"] Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.526285 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4j4k\" (UniqueName: \"kubernetes.io/projected/91effd10-805a-48e2-a65a-529fe1e33a37-kube-api-access-r4j4k\") pod \"openstack-operator-controller-operator-6fdc9d4685-8lhfc\" (UID: \"91effd10-805a-48e2-a65a-529fe1e33a37\") " pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.627654 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4j4k\" (UniqueName: \"kubernetes.io/projected/91effd10-805a-48e2-a65a-529fe1e33a37-kube-api-access-r4j4k\") pod \"openstack-operator-controller-operator-6fdc9d4685-8lhfc\" (UID: \"91effd10-805a-48e2-a65a-529fe1e33a37\") " pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.646276 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4j4k\" (UniqueName: \"kubernetes.io/projected/91effd10-805a-48e2-a65a-529fe1e33a37-kube-api-access-r4j4k\") pod \"openstack-operator-controller-operator-6fdc9d4685-8lhfc\" (UID: \"91effd10-805a-48e2-a65a-529fe1e33a37\") " pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" Dec 03 17:15:03 crc kubenswrapper[4841]: I1203 17:15:03.776979 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" Dec 03 17:15:04 crc kubenswrapper[4841]: I1203 17:15:04.021598 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc"] Dec 03 17:15:04 crc kubenswrapper[4841]: W1203 17:15:04.032487 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91effd10_805a_48e2_a65a_529fe1e33a37.slice/crio-80aff357e2821c5efec9399ab9266148b685bfcf54a1e4fca90c0f39e3244c08 WatchSource:0}: Error finding container 80aff357e2821c5efec9399ab9266148b685bfcf54a1e4fca90c0f39e3244c08: Status 404 returned error can't find the container with id 80aff357e2821c5efec9399ab9266148b685bfcf54a1e4fca90c0f39e3244c08 Dec 03 17:15:04 crc kubenswrapper[4841]: I1203 17:15:04.486052 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" event={"ID":"91effd10-805a-48e2-a65a-529fe1e33a37","Type":"ContainerStarted","Data":"80aff357e2821c5efec9399ab9266148b685bfcf54a1e4fca90c0f39e3244c08"} Dec 03 17:15:11 crc kubenswrapper[4841]: I1203 17:15:11.561021 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" event={"ID":"91effd10-805a-48e2-a65a-529fe1e33a37","Type":"ContainerStarted","Data":"92d0a22f8db53a8b64c63f1d302760ffde8194abe0177c7406e486e04ccf81c3"} Dec 03 17:15:11 crc kubenswrapper[4841]: I1203 17:15:11.561694 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" Dec 03 17:15:11 crc kubenswrapper[4841]: I1203 17:15:11.596369 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" 
podStartSLOduration=2.097814715 podStartE2EDuration="8.596353826s" podCreationTimestamp="2025-12-03 17:15:03 +0000 UTC" firstStartedPulling="2025-12-03 17:15:04.036046952 +0000 UTC m=+898.423567689" lastFinishedPulling="2025-12-03 17:15:10.534586073 +0000 UTC m=+904.922106800" observedRunningTime="2025-12-03 17:15:11.593641148 +0000 UTC m=+905.981161885" watchObservedRunningTime="2025-12-03 17:15:11.596353826 +0000 UTC m=+905.983874553" Dec 03 17:15:20 crc kubenswrapper[4841]: I1203 17:15:20.764436 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tmfm4"] Dec 03 17:15:20 crc kubenswrapper[4841]: I1203 17:15:20.766009 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:20 crc kubenswrapper[4841]: I1203 17:15:20.778162 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmfm4"] Dec 03 17:15:20 crc kubenswrapper[4841]: I1203 17:15:20.919419 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-utilities\") pod \"community-operators-tmfm4\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:20 crc kubenswrapper[4841]: I1203 17:15:20.919572 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjwcv\" (UniqueName: \"kubernetes.io/projected/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-kube-api-access-sjwcv\") pod \"community-operators-tmfm4\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:20 crc kubenswrapper[4841]: I1203 17:15:20.919739 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-catalog-content\") pod \"community-operators-tmfm4\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:21 crc kubenswrapper[4841]: I1203 17:15:21.021159 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-catalog-content\") pod \"community-operators-tmfm4\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:21 crc kubenswrapper[4841]: I1203 17:15:21.021227 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-utilities\") pod \"community-operators-tmfm4\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:21 crc kubenswrapper[4841]: I1203 17:15:21.021308 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjwcv\" (UniqueName: \"kubernetes.io/projected/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-kube-api-access-sjwcv\") pod \"community-operators-tmfm4\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:21 crc kubenswrapper[4841]: I1203 17:15:21.021855 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-catalog-content\") pod \"community-operators-tmfm4\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:21 crc kubenswrapper[4841]: I1203 17:15:21.022037 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-utilities\") pod \"community-operators-tmfm4\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:21 crc kubenswrapper[4841]: I1203 17:15:21.045449 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjwcv\" (UniqueName: \"kubernetes.io/projected/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-kube-api-access-sjwcv\") pod \"community-operators-tmfm4\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:21 crc kubenswrapper[4841]: I1203 17:15:21.121333 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:21 crc kubenswrapper[4841]: I1203 17:15:21.577966 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tmfm4"] Dec 03 17:15:21 crc kubenswrapper[4841]: W1203 17:15:21.585137 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod107ead80_d9cb_4fbb_bb14_dbdb54912a2f.slice/crio-a3fccc5cbeb8c2c3f72d861bd80ce3b3571903196a233548a852651758b1b45f WatchSource:0}: Error finding container a3fccc5cbeb8c2c3f72d861bd80ce3b3571903196a233548a852651758b1b45f: Status 404 returned error can't find the container with id a3fccc5cbeb8c2c3f72d861bd80ce3b3571903196a233548a852651758b1b45f Dec 03 17:15:21 crc kubenswrapper[4841]: I1203 17:15:21.621879 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmfm4" event={"ID":"107ead80-d9cb-4fbb-bb14-dbdb54912a2f","Type":"ContainerStarted","Data":"a3fccc5cbeb8c2c3f72d861bd80ce3b3571903196a233548a852651758b1b45f"} Dec 03 17:15:23 crc kubenswrapper[4841]: I1203 17:15:23.780808 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-operator-controller-operator-6fdc9d4685-8lhfc" Dec 03 17:15:24 crc kubenswrapper[4841]: I1203 17:15:24.642263 4841 generic.go:334] "Generic (PLEG): container finished" podID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerID="9ada7b5596ebdf7d0e22d7892d456c614845aedd0a04b25ff0bb1bd59be3ea37" exitCode=0 Dec 03 17:15:24 crc kubenswrapper[4841]: I1203 17:15:24.642505 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmfm4" event={"ID":"107ead80-d9cb-4fbb-bb14-dbdb54912a2f","Type":"ContainerDied","Data":"9ada7b5596ebdf7d0e22d7892d456c614845aedd0a04b25ff0bb1bd59be3ea37"} Dec 03 17:15:25 crc kubenswrapper[4841]: I1203 17:15:25.651040 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmfm4" event={"ID":"107ead80-d9cb-4fbb-bb14-dbdb54912a2f","Type":"ContainerStarted","Data":"6094f5e4674be0094ecaa2753b36722ba36afd2461b023eb2824604f13260da7"} Dec 03 17:15:26 crc kubenswrapper[4841]: I1203 17:15:26.660826 4841 generic.go:334] "Generic (PLEG): container finished" podID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerID="6094f5e4674be0094ecaa2753b36722ba36afd2461b023eb2824604f13260da7" exitCode=0 Dec 03 17:15:26 crc kubenswrapper[4841]: I1203 17:15:26.660886 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmfm4" event={"ID":"107ead80-d9cb-4fbb-bb14-dbdb54912a2f","Type":"ContainerDied","Data":"6094f5e4674be0094ecaa2753b36722ba36afd2461b023eb2824604f13260da7"} Dec 03 17:15:27 crc kubenswrapper[4841]: I1203 17:15:27.677038 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmfm4" event={"ID":"107ead80-d9cb-4fbb-bb14-dbdb54912a2f","Type":"ContainerStarted","Data":"01b7ebc71bb91d0a18f85da87def58824199c8b68661a7d65ccefe5d0127ac69"} Dec 03 17:15:27 crc kubenswrapper[4841]: I1203 17:15:27.700624 4841 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-marketplace/community-operators-tmfm4" podStartSLOduration=5.291420216 podStartE2EDuration="7.700602873s" podCreationTimestamp="2025-12-03 17:15:20 +0000 UTC" firstStartedPulling="2025-12-03 17:15:24.643706261 +0000 UTC m=+919.031226988" lastFinishedPulling="2025-12-03 17:15:27.052888918 +0000 UTC m=+921.440409645" observedRunningTime="2025-12-03 17:15:27.697718295 +0000 UTC m=+922.085239032" watchObservedRunningTime="2025-12-03 17:15:27.700602873 +0000 UTC m=+922.088123600" Dec 03 17:15:31 crc kubenswrapper[4841]: I1203 17:15:31.121450 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:31 crc kubenswrapper[4841]: I1203 17:15:31.121796 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:31 crc kubenswrapper[4841]: I1203 17:15:31.160778 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:39 crc kubenswrapper[4841]: I1203 17:15:39.316393 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:15:39 crc kubenswrapper[4841]: I1203 17:15:39.317011 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:15:41 crc kubenswrapper[4841]: I1203 17:15:41.158715 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:41 crc kubenswrapper[4841]: I1203 17:15:41.196990 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmfm4"] Dec 03 17:15:41 crc kubenswrapper[4841]: I1203 17:15:41.750830 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tmfm4" podUID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerName="registry-server" containerID="cri-o://01b7ebc71bb91d0a18f85da87def58824199c8b68661a7d65ccefe5d0127ac69" gracePeriod=2 Dec 03 17:15:42 crc kubenswrapper[4841]: I1203 17:15:42.757582 4841 generic.go:334] "Generic (PLEG): container finished" podID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerID="01b7ebc71bb91d0a18f85da87def58824199c8b68661a7d65ccefe5d0127ac69" exitCode=0 Dec 03 17:15:42 crc kubenswrapper[4841]: I1203 17:15:42.757656 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmfm4" event={"ID":"107ead80-d9cb-4fbb-bb14-dbdb54912a2f","Type":"ContainerDied","Data":"01b7ebc71bb91d0a18f85da87def58824199c8b68661a7d65ccefe5d0127ac69"} Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.044936 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.045828 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.048333 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-spgmt" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.066580 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.081022 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.082345 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.087272 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-qgb7p" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.087341 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.088541 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.089744 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-tfdkx" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.097438 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.098988 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.100704 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-b47xq" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.110255 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.117052 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.122256 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.123469 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.125687 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-v55xd" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.130718 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kpjw\" (UniqueName: \"kubernetes.io/projected/c969bc4d-df07-4ec7-b406-7de0710faca8-kube-api-access-4kpjw\") pod \"glance-operator-controller-manager-77987cd8cd-jmzr6\" (UID: \"c969bc4d-df07-4ec7-b406-7de0710faca8\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.130808 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kr8lx\" (UniqueName: \"kubernetes.io/projected/a6ef72b8-96de-4545-9100-081f42138dff-kube-api-access-kr8lx\") pod \"barbican-operator-controller-manager-7d9dfd778-hwp8p\" (UID: \"a6ef72b8-96de-4545-9100-081f42138dff\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.130872 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5kxh\" (UniqueName: \"kubernetes.io/projected/6dbdda39-de04-49e2-8667-58eb77b076b9-kube-api-access-c5kxh\") pod \"cinder-operator-controller-manager-859b6ccc6-qgzs7\" (UID: \"6dbdda39-de04-49e2-8667-58eb77b076b9\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.145129 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.152194 
4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.204845 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.217502 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.224035 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-z6xqs" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.235291 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kr8lx\" (UniqueName: \"kubernetes.io/projected/a6ef72b8-96de-4545-9100-081f42138dff-kube-api-access-kr8lx\") pod \"barbican-operator-controller-manager-7d9dfd778-hwp8p\" (UID: \"a6ef72b8-96de-4545-9100-081f42138dff\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.239973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5kxh\" (UniqueName: \"kubernetes.io/projected/6dbdda39-de04-49e2-8667-58eb77b076b9-kube-api-access-c5kxh\") pod \"cinder-operator-controller-manager-859b6ccc6-qgzs7\" (UID: \"6dbdda39-de04-49e2-8667-58eb77b076b9\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.240017 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4d4v\" (UniqueName: \"kubernetes.io/projected/d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8-kube-api-access-w4d4v\") pod 
\"designate-operator-controller-manager-78b4bc895b-bfvv8\" (UID: \"d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.240058 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58clf\" (UniqueName: \"kubernetes.io/projected/cae5c7a3-2395-4cfe-93f2-5a7301c52444-kube-api-access-58clf\") pod \"heat-operator-controller-manager-5f64f6f8bb-z8zb5\" (UID: \"cae5c7a3-2395-4cfe-93f2-5a7301c52444\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.240102 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kpjw\" (UniqueName: \"kubernetes.io/projected/c969bc4d-df07-4ec7-b406-7de0710faca8-kube-api-access-4kpjw\") pod \"glance-operator-controller-manager-77987cd8cd-jmzr6\" (UID: \"c969bc4d-df07-4ec7-b406-7de0710faca8\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.273230 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.275098 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kr8lx\" (UniqueName: \"kubernetes.io/projected/a6ef72b8-96de-4545-9100-081f42138dff-kube-api-access-kr8lx\") pod \"barbican-operator-controller-manager-7d9dfd778-hwp8p\" (UID: \"a6ef72b8-96de-4545-9100-081f42138dff\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.279302 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kpjw\" (UniqueName: 
\"kubernetes.io/projected/c969bc4d-df07-4ec7-b406-7de0710faca8-kube-api-access-4kpjw\") pod \"glance-operator-controller-manager-77987cd8cd-jmzr6\" (UID: \"c969bc4d-df07-4ec7-b406-7de0710faca8\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.285383 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5kxh\" (UniqueName: \"kubernetes.io/projected/6dbdda39-de04-49e2-8667-58eb77b076b9-kube-api-access-c5kxh\") pod \"cinder-operator-controller-manager-859b6ccc6-qgzs7\" (UID: \"6dbdda39-de04-49e2-8667-58eb77b076b9\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.302071 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.302201 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.309563 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-4j25s" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.318374 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.337769 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.340333 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-86gvr" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.341226 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qgnp\" (UniqueName: \"kubernetes.io/projected/6384ded0-4512-4d89-bef4-004339bb019d-kube-api-access-9qgnp\") pod \"ironic-operator-controller-manager-6c548fd776-jfddn\" (UID: \"6384ded0-4512-4d89-bef4-004339bb019d\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.341282 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4d4v\" (UniqueName: \"kubernetes.io/projected/d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8-kube-api-access-w4d4v\") pod \"designate-operator-controller-manager-78b4bc895b-bfvv8\" (UID: \"d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.341317 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58clf\" (UniqueName: \"kubernetes.io/projected/cae5c7a3-2395-4cfe-93f2-5a7301c52444-kube-api-access-58clf\") pod \"heat-operator-controller-manager-5f64f6f8bb-z8zb5\" (UID: \"cae5c7a3-2395-4cfe-93f2-5a7301c52444\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.341369 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8htq5\" (UniqueName: \"kubernetes.io/projected/096189c4-aa40-4a3d-b8df-f8dbfa674e08-kube-api-access-8htq5\") 
pod \"horizon-operator-controller-manager-68c6d99b8f-8jcwp\" (UID: \"096189c4-aa40-4a3d-b8df-f8dbfa674e08\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.349823 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.369644 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.370727 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.372363 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.387341 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.387530 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-8ck4z" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.404213 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4d4v\" (UniqueName: \"kubernetes.io/projected/d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8-kube-api-access-w4d4v\") pod \"designate-operator-controller-manager-78b4bc895b-bfvv8\" (UID: \"d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.407241 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.408258 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.411726 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-8wldw" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.421256 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.426676 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58clf\" (UniqueName: \"kubernetes.io/projected/cae5c7a3-2395-4cfe-93f2-5a7301c52444-kube-api-access-58clf\") pod \"heat-operator-controller-manager-5f64f6f8bb-z8zb5\" (UID: \"cae5c7a3-2395-4cfe-93f2-5a7301c52444\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.444562 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8htq5\" (UniqueName: \"kubernetes.io/projected/096189c4-aa40-4a3d-b8df-f8dbfa674e08-kube-api-access-8htq5\") pod \"horizon-operator-controller-manager-68c6d99b8f-8jcwp\" (UID: \"096189c4-aa40-4a3d-b8df-f8dbfa674e08\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.444616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " 
pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.444644 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qgnp\" (UniqueName: \"kubernetes.io/projected/6384ded0-4512-4d89-bef4-004339bb019d-kube-api-access-9qgnp\") pod \"ironic-operator-controller-manager-6c548fd776-jfddn\" (UID: \"6384ded0-4512-4d89-bef4-004339bb019d\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.444687 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5t5j\" (UniqueName: \"kubernetes.io/projected/0dda1581-f45b-42cd-840f-9b8f2d7a48b1-kube-api-access-k5t5j\") pod \"keystone-operator-controller-manager-7765d96ddf-fj8w7\" (UID: \"0dda1581-f45b-42cd-840f-9b8f2d7a48b1\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.444710 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf562\" (UniqueName: \"kubernetes.io/projected/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-kube-api-access-qf562\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.444729 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlzj2\" (UniqueName: \"kubernetes.io/projected/abd88bfa-5c17-4486-a051-50c1ceaafe60-kube-api-access-rlzj2\") pod \"mariadb-operator-controller-manager-56bbcc9d85-t9pmr\" (UID: \"abd88bfa-5c17-4486-a051-50c1ceaafe60\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" Dec 03 17:15:43 crc 
kubenswrapper[4841]: I1203 17:15:43.445230 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.452868 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.455078 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.476547 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8htq5\" (UniqueName: \"kubernetes.io/projected/096189c4-aa40-4a3d-b8df-f8dbfa674e08-kube-api-access-8htq5\") pod \"horizon-operator-controller-manager-68c6d99b8f-8jcwp\" (UID: \"096189c4-aa40-4a3d-b8df-f8dbfa674e08\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.489230 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.490205 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qgnp\" (UniqueName: \"kubernetes.io/projected/6384ded0-4512-4d89-bef4-004339bb019d-kube-api-access-9qgnp\") pod \"ironic-operator-controller-manager-6c548fd776-jfddn\" (UID: \"6384ded0-4512-4d89-bef4-004339bb019d\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.515982 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.523961 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.529174 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.530189 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.533673 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-zwd9x" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.539398 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.540437 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.541746 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.545526 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.545622 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5t5j\" (UniqueName: \"kubernetes.io/projected/0dda1581-f45b-42cd-840f-9b8f2d7a48b1-kube-api-access-k5t5j\") pod \"keystone-operator-controller-manager-7765d96ddf-fj8w7\" (UID: \"0dda1581-f45b-42cd-840f-9b8f2d7a48b1\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.545643 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf562\" (UniqueName: \"kubernetes.io/projected/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-kube-api-access-qf562\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.545662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlzj2\" (UniqueName: \"kubernetes.io/projected/abd88bfa-5c17-4486-a051-50c1ceaafe60-kube-api-access-rlzj2\") pod \"mariadb-operator-controller-manager-56bbcc9d85-t9pmr\" (UID: 
\"abd88bfa-5c17-4486-a051-50c1ceaafe60\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.545866 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz"] Dec 03 17:15:43 crc kubenswrapper[4841]: E1203 17:15:43.546234 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 17:15:43 crc kubenswrapper[4841]: E1203 17:15:43.546276 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert podName:b9bdf600-ace4-4f28-80c9-3dd36cf449ad nodeName:}" failed. No retries permitted until 2025-12-03 17:15:44.046259587 +0000 UTC m=+938.433780314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert") pod "infra-operator-controller-manager-57548d458d-s2b4r" (UID: "b9bdf600-ace4-4f28-80c9-3dd36cf449ad") : secret "infra-operator-webhook-server-cert" not found Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.557294 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-l56t4"] Dec 03 17:15:43 crc kubenswrapper[4841]: E1203 17:15:43.557599 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerName="registry-server" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.557614 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerName="registry-server" Dec 03 17:15:43 crc kubenswrapper[4841]: E1203 17:15:43.557630 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerName="extract-content" Dec 03 
17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.557637 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerName="extract-content" Dec 03 17:15:43 crc kubenswrapper[4841]: E1203 17:15:43.557657 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerName="extract-utilities" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.557664 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerName="extract-utilities" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.557797 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" containerName="registry-server" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.558565 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.562200 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-2nxns" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.562780 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-z8xv7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.563080 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.570363 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.571648 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.576795 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-g7v4f" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.578837 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.579539 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlzj2\" (UniqueName: \"kubernetes.io/projected/abd88bfa-5c17-4486-a051-50c1ceaafe60-kube-api-access-rlzj2\") pod \"mariadb-operator-controller-manager-56bbcc9d85-t9pmr\" (UID: \"abd88bfa-5c17-4486-a051-50c1ceaafe60\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.594682 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-l56t4"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.596214 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.603683 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf562\" (UniqueName: \"kubernetes.io/projected/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-kube-api-access-qf562\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.604714 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5t5j\" (UniqueName: 
\"kubernetes.io/projected/0dda1581-f45b-42cd-840f-9b8f2d7a48b1-kube-api-access-k5t5j\") pod \"keystone-operator-controller-manager-7765d96ddf-fj8w7\" (UID: \"0dda1581-f45b-42cd-840f-9b8f2d7a48b1\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.604778 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.606488 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.608037 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.609673 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.609859 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-5mvvc" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.610038 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.610773 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.614781 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-ddxgm" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.617934 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-q9b79"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.619163 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.622103 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-zbzst" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.646788 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-catalog-content\") pod \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.647092 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjwcv\" (UniqueName: \"kubernetes.io/projected/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-kube-api-access-sjwcv\") pod \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.647162 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-utilities\") pod \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\" (UID: \"107ead80-d9cb-4fbb-bb14-dbdb54912a2f\") " Dec 
03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.647413 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk7f4\" (UniqueName: \"kubernetes.io/projected/7b153ea5-5794-46c6-a3f3-099b3b45dfef-kube-api-access-dk7f4\") pod \"ovn-operator-controller-manager-b6456fdb6-2d7g6\" (UID: \"7b153ea5-5794-46c6-a3f3-099b3b45dfef\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.647445 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4dl8\" (UniqueName: \"kubernetes.io/projected/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-kube-api-access-k4dl8\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.647505 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6vg2\" (UniqueName: \"kubernetes.io/projected/36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb-kube-api-access-r6vg2\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-627sl\" (UID: \"36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.647547 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw6dt\" (UniqueName: \"kubernetes.io/projected/a253f1cc-d669-490e-9bf4-aff2e95347b0-kube-api-access-kw6dt\") pod \"nova-operator-controller-manager-697bc559fc-sl5jm\" (UID: \"a253f1cc-d669-490e-9bf4-aff2e95347b0\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.647588 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsl4d\" (UniqueName: \"kubernetes.io/projected/b24334e0-1dd6-4667-8ce1-6013cc71dd7f-kube-api-access-wsl4d\") pod \"octavia-operator-controller-manager-998648c74-l56t4\" (UID: \"b24334e0-1dd6-4667-8ce1-6013cc71dd7f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.647663 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.647688 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsr8h\" (UniqueName: \"kubernetes.io/projected/38a95f1c-87ae-4464-b6fa-ad329d17290e-kube-api-access-jsr8h\") pod \"manila-operator-controller-manager-7c79b5df47-khncz\" (UID: \"38a95f1c-87ae-4464-b6fa-ad329d17290e\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.652400 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.654335 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-utilities" (OuterVolumeSpecName: "utilities") pod "107ead80-d9cb-4fbb-bb14-dbdb54912a2f" (UID: "107ead80-d9cb-4fbb-bb14-dbdb54912a2f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.658011 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.662716 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-kube-api-access-sjwcv" (OuterVolumeSpecName: "kube-api-access-sjwcv") pod "107ead80-d9cb-4fbb-bb14-dbdb54912a2f" (UID: "107ead80-d9cb-4fbb-bb14-dbdb54912a2f"). InnerVolumeSpecName "kube-api-access-sjwcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.674636 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.675699 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.679162 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-q9b79"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.685845 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-wpdpd" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.693830 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.724911 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.754788 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk7f4\" (UniqueName: \"kubernetes.io/projected/7b153ea5-5794-46c6-a3f3-099b3b45dfef-kube-api-access-dk7f4\") pod \"ovn-operator-controller-manager-b6456fdb6-2d7g6\" (UID: \"7b153ea5-5794-46c6-a3f3-099b3b45dfef\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.754833 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4dl8\" (UniqueName: \"kubernetes.io/projected/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-kube-api-access-k4dl8\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.754923 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7t4v8\" (UniqueName: \"kubernetes.io/projected/70e46d25-a5c6-49b4-b3d5-0828bc234644-kube-api-access-7t4v8\") pod \"swift-operator-controller-manager-5f8c65bbfc-km89l\" (UID: \"70e46d25-a5c6-49b4-b3d5-0828bc234644\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.754965 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6vg2\" (UniqueName: \"kubernetes.io/projected/36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb-kube-api-access-r6vg2\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-627sl\" (UID: \"36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.755025 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw6dt\" (UniqueName: \"kubernetes.io/projected/a253f1cc-d669-490e-9bf4-aff2e95347b0-kube-api-access-kw6dt\") pod \"nova-operator-controller-manager-697bc559fc-sl5jm\" (UID: \"a253f1cc-d669-490e-9bf4-aff2e95347b0\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.755115 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsl4d\" (UniqueName: \"kubernetes.io/projected/b24334e0-1dd6-4667-8ce1-6013cc71dd7f-kube-api-access-wsl4d\") pod \"octavia-operator-controller-manager-998648c74-l56t4\" (UID: \"b24334e0-1dd6-4667-8ce1-6013cc71dd7f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.755178 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmk7v\" (UniqueName: \"kubernetes.io/projected/a4eccc19-eb01-4b44-99e0-041144e4b409-kube-api-access-fmk7v\") pod 
\"placement-operator-controller-manager-78f8948974-q9b79\" (UID: \"a4eccc19-eb01-4b44-99e0-041144e4b409\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.755245 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.755288 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsr8h\" (UniqueName: \"kubernetes.io/projected/38a95f1c-87ae-4464-b6fa-ad329d17290e-kube-api-access-jsr8h\") pod \"manila-operator-controller-manager-7c79b5df47-khncz\" (UID: \"38a95f1c-87ae-4464-b6fa-ad329d17290e\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.755506 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.755533 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjwcv\" (UniqueName: \"kubernetes.io/projected/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-kube-api-access-sjwcv\") on node \"crc\" DevicePath \"\"" Dec 03 17:15:43 crc kubenswrapper[4841]: E1203 17:15:43.757806 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:43 crc kubenswrapper[4841]: E1203 17:15:43.757862 4841 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert podName:7e01626f-e7f3-4c48-bc9b-5d9261b3d89a nodeName:}" failed. No retries permitted until 2025-12-03 17:15:44.257845468 +0000 UTC m=+938.645366195 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" (UID: "7e01626f-e7f3-4c48-bc9b-5d9261b3d89a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.762834 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.764415 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.783247 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk7f4\" (UniqueName: \"kubernetes.io/projected/7b153ea5-5794-46c6-a3f3-099b3b45dfef-kube-api-access-dk7f4\") pod \"ovn-operator-controller-manager-b6456fdb6-2d7g6\" (UID: \"7b153ea5-5794-46c6-a3f3-099b3b45dfef\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.783524 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "107ead80-d9cb-4fbb-bb14-dbdb54912a2f" (UID: "107ead80-d9cb-4fbb-bb14-dbdb54912a2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.791873 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsl4d\" (UniqueName: \"kubernetes.io/projected/b24334e0-1dd6-4667-8ce1-6013cc71dd7f-kube-api-access-wsl4d\") pod \"octavia-operator-controller-manager-998648c74-l56t4\" (UID: \"b24334e0-1dd6-4667-8ce1-6013cc71dd7f\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.793248 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.804619 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4dl8\" (UniqueName: \"kubernetes.io/projected/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-kube-api-access-k4dl8\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.846621 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.854590 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.855262 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6vg2\" (UniqueName: \"kubernetes.io/projected/36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb-kube-api-access-r6vg2\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-627sl\" (UID: \"36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.857766 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw6dt\" (UniqueName: \"kubernetes.io/projected/a253f1cc-d669-490e-9bf4-aff2e95347b0-kube-api-access-kw6dt\") pod \"nova-operator-controller-manager-697bc559fc-sl5jm\" (UID: \"a253f1cc-d669-490e-9bf4-aff2e95347b0\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.859621 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmk7v\" (UniqueName: \"kubernetes.io/projected/a4eccc19-eb01-4b44-99e0-041144e4b409-kube-api-access-fmk7v\") pod \"placement-operator-controller-manager-78f8948974-q9b79\" (UID: \"a4eccc19-eb01-4b44-99e0-041144e4b409\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.859653 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.859734 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-bxh6d" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.859744 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t4v8\" (UniqueName: \"kubernetes.io/projected/70e46d25-a5c6-49b4-b3d5-0828bc234644-kube-api-access-7t4v8\") pod \"swift-operator-controller-manager-5f8c65bbfc-km89l\" (UID: \"70e46d25-a5c6-49b4-b3d5-0828bc234644\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.859826 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107ead80-d9cb-4fbb-bb14-dbdb54912a2f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.861530 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.869470 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-28fb6" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.912761 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.914173 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.918527 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmk7v\" (UniqueName: \"kubernetes.io/projected/a4eccc19-eb01-4b44-99e0-041144e4b409-kube-api-access-fmk7v\") pod \"placement-operator-controller-manager-78f8948974-q9b79\" (UID: \"a4eccc19-eb01-4b44-99e0-041144e4b409\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.918557 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsr8h\" (UniqueName: \"kubernetes.io/projected/38a95f1c-87ae-4464-b6fa-ad329d17290e-kube-api-access-jsr8h\") pod \"manila-operator-controller-manager-7c79b5df47-khncz\" (UID: \"38a95f1c-87ae-4464-b6fa-ad329d17290e\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.919203 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-cctsm" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.920307 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tmfm4" event={"ID":"107ead80-d9cb-4fbb-bb14-dbdb54912a2f","Type":"ContainerDied","Data":"a3fccc5cbeb8c2c3f72d861bd80ce3b3571903196a233548a852651758b1b45f"} Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.920411 4841 scope.go:117] "RemoveContainer" containerID="01b7ebc71bb91d0a18f85da87def58824199c8b68661a7d65ccefe5d0127ac69" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.920694 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tmfm4" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.922346 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t4v8\" (UniqueName: \"kubernetes.io/projected/70e46d25-a5c6-49b4-b3d5-0828bc234644-kube-api-access-7t4v8\") pod \"swift-operator-controller-manager-5f8c65bbfc-km89l\" (UID: \"70e46d25-a5c6-49b4-b3d5-0828bc234644\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.927996 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.935973 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.946782 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.961060 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkvvw\" (UniqueName: \"kubernetes.io/projected/1348de54-9137-400b-b3db-b684d9a03dc4-kube-api-access-vkvvw\") pod \"telemetry-operator-controller-manager-65c59f5d56-72jtb\" (UID: \"1348de54-9137-400b-b3db-b684d9a03dc4\") " pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.975263 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.984538 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld"] Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.987666 4841 scope.go:117] "RemoveContainer" containerID="6094f5e4674be0094ecaa2753b36722ba36afd2461b023eb2824604f13260da7" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.990351 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.992974 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mvmsn" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.994366 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.994516 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 17:15:43 crc kubenswrapper[4841]: I1203 17:15:43.996463 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.009645 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8vhqc"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.011887 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vhqc"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.011988 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.014182 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.018112 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.019964 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.031226 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-mflb2" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.032009 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.049243 4841 scope.go:117] "RemoveContainer" containerID="9ada7b5596ebdf7d0e22d7892d456c614845aedd0a04b25ff0bb1bd59be3ea37" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.063288 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069299 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-catalog-content\") pod \"certified-operators-8vhqc\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftwkv\" (UniqueName: \"kubernetes.io/projected/8a13d35a-d714-4a7f-922b-a6d3a0b580c3-kube-api-access-ftwkv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-tv5cl\" (UID: \"8a13d35a-d714-4a7f-922b-a6d3a0b580c3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069430 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87m6c\" (UniqueName: \"kubernetes.io/projected/3448d609-0836-4562-ac6b-03d353471880-kube-api-access-87m6c\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069477 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069498 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069576 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmj9v\" (UniqueName: \"kubernetes.io/projected/fe8b56bd-b492-48cf-a3f2-621b4f58d29c-kube-api-access-vmj9v\") pod \"watcher-operator-controller-manager-769dc69bc-r9rbr\" (UID: \"fe8b56bd-b492-48cf-a3f2-621b4f58d29c\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069612 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgjbv\" (UniqueName: \"kubernetes.io/projected/56b976ca-c419-42f4-b063-c0219f4e0a72-kube-api-access-hgjbv\") pod \"test-operator-controller-manager-5854674fcc-fnjsz\" (UID: \"56b976ca-c419-42f4-b063-c0219f4e0a72\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.069625 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069643 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-utilities\") pod \"certified-operators-8vhqc\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 
17:15:44.069670 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert podName:b9bdf600-ace4-4f28-80c9-3dd36cf449ad nodeName:}" failed. No retries permitted until 2025-12-03 17:15:45.069655539 +0000 UTC m=+939.457176266 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert") pod "infra-operator-controller-manager-57548d458d-s2b4r" (UID: "b9bdf600-ace4-4f28-80c9-3dd36cf449ad") : secret "infra-operator-webhook-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069689 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069737 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c87w\" (UniqueName: \"kubernetes.io/projected/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-kube-api-access-6c87w\") pod \"certified-operators-8vhqc\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.069766 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkvvw\" (UniqueName: \"kubernetes.io/projected/1348de54-9137-400b-b3db-b684d9a03dc4-kube-api-access-vkvvw\") pod \"telemetry-operator-controller-manager-65c59f5d56-72jtb\" (UID: \"1348de54-9137-400b-b3db-b684d9a03dc4\") " pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" Dec 03 17:15:44 crc 
kubenswrapper[4841]: I1203 17:15:44.078698 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.089886 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tmfm4"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.093939 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tmfm4"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.099787 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkvvw\" (UniqueName: \"kubernetes.io/projected/1348de54-9137-400b-b3db-b684d9a03dc4-kube-api-access-vkvvw\") pod \"telemetry-operator-controller-manager-65c59f5d56-72jtb\" (UID: \"1348de54-9137-400b-b3db-b684d9a03dc4\") " pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.156753 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.164793 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.171061 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-catalog-content\") pod \"certified-operators-8vhqc\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.171155 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftwkv\" (UniqueName: \"kubernetes.io/projected/8a13d35a-d714-4a7f-922b-a6d3a0b580c3-kube-api-access-ftwkv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-tv5cl\" (UID: \"8a13d35a-d714-4a7f-922b-a6d3a0b580c3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.171178 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87m6c\" (UniqueName: \"kubernetes.io/projected/3448d609-0836-4562-ac6b-03d353471880-kube-api-access-87m6c\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.171202 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " 
pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.171255 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmj9v\" (UniqueName: \"kubernetes.io/projected/fe8b56bd-b492-48cf-a3f2-621b4f58d29c-kube-api-access-vmj9v\") pod \"watcher-operator-controller-manager-769dc69bc-r9rbr\" (UID: \"fe8b56bd-b492-48cf-a3f2-621b4f58d29c\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.171276 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgjbv\" (UniqueName: \"kubernetes.io/projected/56b976ca-c419-42f4-b063-c0219f4e0a72-kube-api-access-hgjbv\") pod \"test-operator-controller-manager-5854674fcc-fnjsz\" (UID: \"56b976ca-c419-42f4-b063-c0219f4e0a72\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.171385 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-utilities\") pod \"certified-operators-8vhqc\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.171867 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.171957 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:44.671939328 +0000 UTC m=+939.059460055 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "webhook-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.172053 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-catalog-content\") pod \"certified-operators-8vhqc\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.172169 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.172229 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:44.672219714 +0000 UTC m=+939.059740441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "metrics-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.172288 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-utilities\") pod \"certified-operators-8vhqc\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.171411 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.178799 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c87w\" (UniqueName: \"kubernetes.io/projected/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-kube-api-access-6c87w\") pod \"certified-operators-8vhqc\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.190406 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftwkv\" (UniqueName: \"kubernetes.io/projected/8a13d35a-d714-4a7f-922b-a6d3a0b580c3-kube-api-access-ftwkv\") pod \"rabbitmq-cluster-operator-manager-668c99d594-tv5cl\" (UID: \"8a13d35a-d714-4a7f-922b-a6d3a0b580c3\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" Dec 03 17:15:44 crc 
kubenswrapper[4841]: I1203 17:15:44.192637 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.194834 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgjbv\" (UniqueName: \"kubernetes.io/projected/56b976ca-c419-42f4-b063-c0219f4e0a72-kube-api-access-hgjbv\") pod \"test-operator-controller-manager-5854674fcc-fnjsz\" (UID: \"56b976ca-c419-42f4-b063-c0219f4e0a72\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.196029 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87m6c\" (UniqueName: \"kubernetes.io/projected/3448d609-0836-4562-ac6b-03d353471880-kube-api-access-87m6c\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.197651 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmj9v\" (UniqueName: \"kubernetes.io/projected/fe8b56bd-b492-48cf-a3f2-621b4f58d29c-kube-api-access-vmj9v\") pod \"watcher-operator-controller-manager-769dc69bc-r9rbr\" (UID: \"fe8b56bd-b492-48cf-a3f2-621b4f58d29c\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.199991 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c87w\" (UniqueName: \"kubernetes.io/projected/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-kube-api-access-6c87w\") pod \"certified-operators-8vhqc\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: 
W1203 17:15:44.213800 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ef72b8_96de_4545_9100_081f42138dff.slice/crio-d5b289879bc57273dee2ca96527b2ff31f8957658edc1aea320f8d45ece72dd5 WatchSource:0}: Error finding container d5b289879bc57273dee2ca96527b2ff31f8957658edc1aea320f8d45ece72dd5: Status 404 returned error can't find the container with id d5b289879bc57273dee2ca96527b2ff31f8957658edc1aea320f8d45ece72dd5 Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.236778 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.259207 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107ead80-d9cb-4fbb-bb14-dbdb54912a2f" path="/var/lib/kubelet/pods/107ead80-d9cb-4fbb-bb14-dbdb54912a2f/volumes" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.262654 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.265880 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.281939 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.283872 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.284022 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert podName:7e01626f-e7f3-4c48-bc9b-5d9261b3d89a nodeName:}" failed. No retries permitted until 2025-12-03 17:15:45.283997085 +0000 UTC m=+939.671517812 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" (UID: "7e01626f-e7f3-4c48-bc9b-5d9261b3d89a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.418382 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.426314 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.437056 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.507416 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.539389 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.553800 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.568737 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5"] Dec 03 17:15:44 crc kubenswrapper[4841]: W1203 17:15:44.625128 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc969bc4d_df07_4ec7_b406_7de0710faca8.slice/crio-fee0460ba73f835fa4b44369b40121ec24486a55ff912665bb2297f10e59fcdd WatchSource:0}: Error finding container fee0460ba73f835fa4b44369b40121ec24486a55ff912665bb2297f10e59fcdd: Status 404 returned error can't find the container with id fee0460ba73f835fa4b44369b40121ec24486a55ff912665bb2297f10e59fcdd Dec 03 17:15:44 crc kubenswrapper[4841]: W1203 17:15:44.663674 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5ecd72c_3074_4870_bbe6_f6bfbbe5d5f8.slice/crio-da34857f48f842503957e48bc0ac480966562559562285ccea29b51530765f82 WatchSource:0}: Error finding container 
da34857f48f842503957e48bc0ac480966562559562285ccea29b51530765f82: Status 404 returned error can't find the container with id da34857f48f842503957e48bc0ac480966562559562285ccea29b51530765f82 Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.690387 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.690635 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.690660 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.690718 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.690736 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:45.690713781 +0000 UTC m=+940.078234578 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "webhook-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: E1203 17:15:44.690775 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:45.690758512 +0000 UTC m=+940.078279269 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "metrics-server-cert" not found Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.740142 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz"] Dec 03 17:15:44 crc kubenswrapper[4841]: W1203 17:15:44.765198 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38a95f1c_87ae_4464_b6fa_ad329d17290e.slice/crio-230b6a329923a727249ea15770d91877aa5d5d5b1f8ff989bbb5fd8b967715ac WatchSource:0}: Error finding container 230b6a329923a727249ea15770d91877aa5d5d5b1f8ff989bbb5fd8b967715ac: Status 404 returned error can't find the container with id 230b6a329923a727249ea15770d91877aa5d5d5b1f8ff989bbb5fd8b967715ac Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.783606 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.792631 4841 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.951716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" event={"ID":"c969bc4d-df07-4ec7-b406-7de0710faca8","Type":"ContainerStarted","Data":"fee0460ba73f835fa4b44369b40121ec24486a55ff912665bb2297f10e59fcdd"} Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.959271 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" event={"ID":"6384ded0-4512-4d89-bef4-004339bb019d","Type":"ContainerStarted","Data":"bbf47fb75950b174d2101d1350597b6b2439925f9a86075fa8646dca3717af99"} Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.968343 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" event={"ID":"6dbdda39-de04-49e2-8667-58eb77b076b9","Type":"ContainerStarted","Data":"e8598460dfad4005a95d7772db91be8be74e5dca7acf3584692134c0a56a4fb5"} Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.974402 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" event={"ID":"cae5c7a3-2395-4cfe-93f2-5a7301c52444","Type":"ContainerStarted","Data":"637690ef26d49c542fa2be9c6280dd48d4f0d85dbe7de5592cc0c70257029373"} Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.976230 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.976321 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" 
event={"ID":"096189c4-aa40-4a3d-b8df-f8dbfa674e08","Type":"ContainerStarted","Data":"ca874c97072dab15430bd40b6def96e4661b011f222379e293cd53b2e97d391e"} Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.986757 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" event={"ID":"abd88bfa-5c17-4486-a051-50c1ceaafe60","Type":"ContainerStarted","Data":"e65c42371859103a55793db7090bf4b912be826efb60ea353190bd9b9c91d34d"} Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.991368 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-l56t4"] Dec 03 17:15:44 crc kubenswrapper[4841]: I1203 17:15:44.997775 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" event={"ID":"38a95f1c-87ae-4464-b6fa-ad329d17290e","Type":"ContainerStarted","Data":"230b6a329923a727249ea15770d91877aa5d5d5b1f8ff989bbb5fd8b967715ac"} Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.009459 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm"] Dec 03 17:15:45 crc kubenswrapper[4841]: W1203 17:15:45.012083 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70e46d25_a5c6_49b4_b3d5_0828bc234644.slice/crio-4cc1a4b82725266300a8b658ae22c283de9c653c6d2a309f2f1bc0e036718d93 WatchSource:0}: Error finding container 4cc1a4b82725266300a8b658ae22c283de9c653c6d2a309f2f1bc0e036718d93: Status 404 returned error can't find the container with id 4cc1a4b82725266300a8b658ae22c283de9c653c6d2a309f2f1bc0e036718d93 Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.013093 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" 
event={"ID":"0dda1581-f45b-42cd-840f-9b8f2d7a48b1","Type":"ContainerStarted","Data":"f3294e7081bf1ff38018cdd10601882c3428f8540ee84bf86104e2017d8cf94a"} Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.017746 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" event={"ID":"d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8","Type":"ContainerStarted","Data":"da34857f48f842503957e48bc0ac480966562559562285ccea29b51530765f82"} Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.020190 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" event={"ID":"a6ef72b8-96de-4545-9100-081f42138dff","Type":"ContainerStarted","Data":"d5b289879bc57273dee2ca96527b2ff31f8957658edc1aea320f8d45ece72dd5"} Dec 03 17:15:45 crc kubenswrapper[4841]: W1203 17:15:45.040205 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda253f1cc_d669_490e_9bf4_aff2e95347b0.slice/crio-a8bf53f5aa50cbf69e8d4bdd086b98b9e7cca66357f84695cc0ce8543ad403cd WatchSource:0}: Error finding container a8bf53f5aa50cbf69e8d4bdd086b98b9e7cca66357f84695cc0ce8543ad403cd: Status 404 returned error can't find the container with id a8bf53f5aa50cbf69e8d4bdd086b98b9e7cca66357f84695cc0ce8543ad403cd Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.105815 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.106021 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.106100 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert podName:b9bdf600-ace4-4f28-80c9-3dd36cf449ad nodeName:}" failed. No retries permitted until 2025-12-03 17:15:47.10607605 +0000 UTC m=+941.493596777 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert") pod "infra-operator-controller-manager-57548d458d-s2b4r" (UID: "b9bdf600-ace4-4f28-80c9-3dd36cf449ad") : secret "infra-operator-webhook-server-cert" not found Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.136596 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6"] Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.308608 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.308819 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.308881 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert podName:7e01626f-e7f3-4c48-bc9b-5d9261b3d89a nodeName:}" failed. No retries permitted until 2025-12-03 17:15:47.308864106 +0000 UTC m=+941.696384833 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" (UID: "7e01626f-e7f3-4c48-bc9b-5d9261b3d89a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.335359 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-q9b79"] Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.350471 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl"] Dec 03 17:15:45 crc kubenswrapper[4841]: W1203 17:15:45.357886 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36b8b756_d0a6_4db5_a33d_dcf5b9e77bbb.slice/crio-40c275f358ab01bac49d35be4e8cf429ca958982f5716e00156cf6a50915e3db WatchSource:0}: Error finding container 40c275f358ab01bac49d35be4e8cf429ca958982f5716e00156cf6a50915e3db: Status 404 returned error can't find the container with id 40c275f358ab01bac49d35be4e8cf429ca958982f5716e00156cf6a50915e3db Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.377540 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr"] Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.382796 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fmk7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-q9b79_openstack-operators(a4eccc19-eb01-4b44-99e0-041144e4b409): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.391810 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fmk7v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-q9b79_openstack-operators(a4eccc19-eb01-4b44-99e0-041144e4b409): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.393293 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" podUID="a4eccc19-eb01-4b44-99e0-041144e4b409" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.398842 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vmj9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-r9rbr_openstack-operators(fe8b56bd-b492-48cf-a3f2-621b4f58d29c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.400917 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vmj9v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-r9rbr_openstack-operators(fe8b56bd-b492-48cf-a3f2-621b4f58d29c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.401988 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" podUID="fe8b56bd-b492-48cf-a3f2-621b4f58d29c" Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.403222 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb"] Dec 03 17:15:45 crc kubenswrapper[4841]: W1203 17:15:45.419289 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1348de54_9137_400b_b3db_b684d9a03dc4.slice/crio-89dc40f5167fee572aa8b7d83074fe1b802676e584656ace43dd46f5a88eba17 
WatchSource:0}: Error finding container 89dc40f5167fee572aa8b7d83074fe1b802676e584656ace43dd46f5a88eba17: Status 404 returned error can't find the container with id 89dc40f5167fee572aa8b7d83074fe1b802676e584656ace43dd46f5a88eba17 Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.421244 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.138:5001/openstack-k8s-operators/telemetry-operator:8758995fb297e7c94a1755f70b853b0980addb81,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vkvvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-65c59f5d56-72jtb_openstack-operators(1348de54-9137-400b-b3db-b684d9a03dc4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.423183 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vkvvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-65c59f5d56-72jtb_openstack-operators(1348de54-9137-400b-b3db-b684d9a03dc4): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.424359 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" podUID="1348de54-9137-400b-b3db-b684d9a03dc4" Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.479270 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8vhqc"] Dec 03 17:15:45 crc kubenswrapper[4841]: W1203 17:15:45.489367 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9cc4d57_7fab_4a4e_9a5d_b334918f4c49.slice/crio-fed136fb6f121cf45ff305f41935596ab8b3a8a49e4f032c3429c8e86f161ee3 WatchSource:0}: Error 
finding container fed136fb6f121cf45ff305f41935596ab8b3a8a49e4f032c3429c8e86f161ee3: Status 404 returned error can't find the container with id fed136fb6f121cf45ff305f41935596ab8b3a8a49e4f032c3429c8e86f161ee3 Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.495734 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz"] Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.502996 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl"] Dec 03 17:15:45 crc kubenswrapper[4841]: W1203 17:15:45.504787 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56b976ca_c419_42f4_b063_c0219f4e0a72.slice/crio-9b963c3231758008596465231b245f3527499991da70cdb48dc925e4d3a8fc7a WatchSource:0}: Error finding container 9b963c3231758008596465231b245f3527499991da70cdb48dc925e4d3a8fc7a: Status 404 returned error can't find the container with id 9b963c3231758008596465231b245f3527499991da70cdb48dc925e4d3a8fc7a Dec 03 17:15:45 crc kubenswrapper[4841]: W1203 17:15:45.517211 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a13d35a_d714_4a7f_922b_a6d3a0b580c3.slice/crio-d22b7f4254d0be6ebb516b33acfcc881557a1547e0f439444645d8726d9d1827 WatchSource:0}: Error finding container d22b7f4254d0be6ebb516b33acfcc881557a1547e0f439444645d8726d9d1827: Status 404 returned error can't find the container with id d22b7f4254d0be6ebb516b33acfcc881557a1547e0f439444645d8726d9d1827 Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.519544 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ftwkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-tv5cl_openstack-operators(8a13d35a-d714-4a7f-922b-a6d3a0b580c3): 
ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.520720 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" podUID="8a13d35a-d714-4a7f-922b-a6d3a0b580c3" Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.714249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:45 crc kubenswrapper[4841]: I1203 17:15:45.714647 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.714439 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.714743 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:47.714722753 +0000 UTC m=+942.102243480 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "webhook-server-cert" not found Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.714817 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 17:15:45 crc kubenswrapper[4841]: E1203 17:15:45.714869 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:47.714853096 +0000 UTC m=+942.102373823 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "metrics-server-cert" not found Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.038347 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" event={"ID":"b24334e0-1dd6-4667-8ce1-6013cc71dd7f","Type":"ContainerStarted","Data":"528e417c994ee131959e98f926736997bb6101bf069c47e1afcad2a3eadf9a15"} Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.041504 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerID="7dbdbd4e2177da28b59066fa9f42763601f33d01adf99fbfa9b64037c7376ac9" exitCode=0 Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.041569 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhqc" 
event={"ID":"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49","Type":"ContainerDied","Data":"7dbdbd4e2177da28b59066fa9f42763601f33d01adf99fbfa9b64037c7376ac9"} Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.041586 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhqc" event={"ID":"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49","Type":"ContainerStarted","Data":"fed136fb6f121cf45ff305f41935596ab8b3a8a49e4f032c3429c8e86f161ee3"} Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.045698 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" event={"ID":"36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb","Type":"ContainerStarted","Data":"40c275f358ab01bac49d35be4e8cf429ca958982f5716e00156cf6a50915e3db"} Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.051363 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" event={"ID":"a253f1cc-d669-490e-9bf4-aff2e95347b0","Type":"ContainerStarted","Data":"a8bf53f5aa50cbf69e8d4bdd086b98b9e7cca66357f84695cc0ce8543ad403cd"} Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.053538 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" event={"ID":"fe8b56bd-b492-48cf-a3f2-621b4f58d29c","Type":"ContainerStarted","Data":"7dac5b8171b2719014f0b32ae1cbfcd233f19f9b48b495f1f64aebcf7f32fcd2"} Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.058243 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" event={"ID":"56b976ca-c419-42f4-b063-c0219f4e0a72","Type":"ContainerStarted","Data":"9b963c3231758008596465231b245f3527499991da70cdb48dc925e4d3a8fc7a"} Dec 03 17:15:46 crc kubenswrapper[4841]: E1203 17:15:46.059244 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to 
\"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" podUID="fe8b56bd-b492-48cf-a3f2-621b4f58d29c" Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.069034 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" event={"ID":"1348de54-9137-400b-b3db-b684d9a03dc4","Type":"ContainerStarted","Data":"89dc40f5167fee572aa8b7d83074fe1b802676e584656ace43dd46f5a88eba17"} Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.076191 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" event={"ID":"7b153ea5-5794-46c6-a3f3-099b3b45dfef","Type":"ContainerStarted","Data":"fdd73a68cc642871dcb680d5c82d4d52b694d92829439f0a871e9929226ffc32"} Dec 03 17:15:46 crc kubenswrapper[4841]: E1203 17:15:46.078136 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.138:5001/openstack-k8s-operators/telemetry-operator:8758995fb297e7c94a1755f70b853b0980addb81\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" podUID="1348de54-9137-400b-b3db-b684d9a03dc4" Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.081290 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" event={"ID":"70e46d25-a5c6-49b4-b3d5-0828bc234644","Type":"ContainerStarted","Data":"4cc1a4b82725266300a8b658ae22c283de9c653c6d2a309f2f1bc0e036718d93"} Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.091794 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" event={"ID":"8a13d35a-d714-4a7f-922b-a6d3a0b580c3","Type":"ContainerStarted","Data":"d22b7f4254d0be6ebb516b33acfcc881557a1547e0f439444645d8726d9d1827"} Dec 03 17:15:46 crc kubenswrapper[4841]: E1203 17:15:46.106770 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" podUID="8a13d35a-d714-4a7f-922b-a6d3a0b580c3" Dec 03 17:15:46 crc kubenswrapper[4841]: I1203 17:15:46.143301 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" event={"ID":"a4eccc19-eb01-4b44-99e0-041144e4b409","Type":"ContainerStarted","Data":"4b386c2ee75f06fc348ed93e25c217cf149825075619be5e70435f3b0da9a832"} Dec 03 17:15:46 crc kubenswrapper[4841]: E1203 17:15:46.157156 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" podUID="a4eccc19-eb01-4b44-99e0-041144e4b409" Dec 03 17:15:47 crc kubenswrapper[4841]: I1203 17:15:47.139349 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:47 crc kubenswrapper[4841]: E1203 17:15:47.140077 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 17:15:47 crc kubenswrapper[4841]: E1203 17:15:47.140165 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert podName:b9bdf600-ace4-4f28-80c9-3dd36cf449ad nodeName:}" failed. No retries permitted until 2025-12-03 17:15:51.140134088 +0000 UTC m=+945.527654815 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert") pod "infra-operator-controller-manager-57548d458d-s2b4r" (UID: "b9bdf600-ace4-4f28-80c9-3dd36cf449ad") : secret "infra-operator-webhook-server-cert" not found Dec 03 17:15:47 crc kubenswrapper[4841]: I1203 17:15:47.345077 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:47 crc kubenswrapper[4841]: E1203 17:15:47.345333 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:47 crc kubenswrapper[4841]: E1203 17:15:47.345397 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert podName:7e01626f-e7f3-4c48-bc9b-5d9261b3d89a nodeName:}" failed. No retries permitted until 2025-12-03 17:15:51.345380811 +0000 UTC m=+945.732901538 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" (UID: "7e01626f-e7f3-4c48-bc9b-5d9261b3d89a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:47 crc kubenswrapper[4841]: I1203 17:15:47.750878 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:47 crc kubenswrapper[4841]: I1203 17:15:47.750960 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:47 crc kubenswrapper[4841]: E1203 17:15:47.751071 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 17:15:47 crc kubenswrapper[4841]: E1203 17:15:47.751062 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 17:15:47 crc kubenswrapper[4841]: E1203 17:15:47.751120 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:51.751104356 +0000 UTC m=+946.138625083 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "metrics-server-cert" not found Dec 03 17:15:47 crc kubenswrapper[4841]: E1203 17:15:47.751160 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:51.751137256 +0000 UTC m=+946.138658053 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "webhook-server-cert" not found Dec 03 17:15:48 crc kubenswrapper[4841]: E1203 17:15:48.497486 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" podUID="8a13d35a-d714-4a7f-922b-a6d3a0b580c3" Dec 03 17:15:48 crc kubenswrapper[4841]: E1203 17:15:48.497701 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.138:5001/openstack-k8s-operators/telemetry-operator:8758995fb297e7c94a1755f70b853b0980addb81\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" podUID="1348de54-9137-400b-b3db-b684d9a03dc4" Dec 03 17:15:48 crc kubenswrapper[4841]: E1203 17:15:48.505787 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" podUID="fe8b56bd-b492-48cf-a3f2-621b4f58d29c" Dec 03 17:15:48 crc kubenswrapper[4841]: E1203 17:15:48.505790 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" podUID="a4eccc19-eb01-4b44-99e0-041144e4b409" Dec 03 17:15:51 crc kubenswrapper[4841]: I1203 17:15:51.202007 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:51 crc kubenswrapper[4841]: E1203 17:15:51.202210 4841 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: 
secret "infra-operator-webhook-server-cert" not found Dec 03 17:15:51 crc kubenswrapper[4841]: E1203 17:15:51.202602 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert podName:b9bdf600-ace4-4f28-80c9-3dd36cf449ad nodeName:}" failed. No retries permitted until 2025-12-03 17:15:59.202582662 +0000 UTC m=+953.590103389 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert") pod "infra-operator-controller-manager-57548d458d-s2b4r" (UID: "b9bdf600-ace4-4f28-80c9-3dd36cf449ad") : secret "infra-operator-webhook-server-cert" not found Dec 03 17:15:51 crc kubenswrapper[4841]: I1203 17:15:51.406171 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:51 crc kubenswrapper[4841]: E1203 17:15:51.406366 4841 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:51 crc kubenswrapper[4841]: E1203 17:15:51.406412 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert podName:7e01626f-e7f3-4c48-bc9b-5d9261b3d89a nodeName:}" failed. No retries permitted until 2025-12-03 17:15:59.406396791 +0000 UTC m=+953.793917518 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" (UID: "7e01626f-e7f3-4c48-bc9b-5d9261b3d89a") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 17:15:51 crc kubenswrapper[4841]: I1203 17:15:51.811493 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:51 crc kubenswrapper[4841]: I1203 17:15:51.811592 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:51 crc kubenswrapper[4841]: E1203 17:15:51.811735 4841 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 17:15:51 crc kubenswrapper[4841]: E1203 17:15:51.811791 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:59.811773286 +0000 UTC m=+954.199294013 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "metrics-server-cert" not found Dec 03 17:15:51 crc kubenswrapper[4841]: E1203 17:15:51.812219 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 17:15:51 crc kubenswrapper[4841]: E1203 17:15:51.812272 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:15:59.812262297 +0000 UTC m=+954.199783034 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "webhook-server-cert" not found Dec 03 17:15:58 crc kubenswrapper[4841]: E1203 17:15:58.738405 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59" Dec 03 17:15:58 crc kubenswrapper[4841]: E1203 17:15:58.738974 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:635a4aef9d6f0b799e8ec91333dbb312160c001d05b3c63f614c124e0b67cb59,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dk7f4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-2d7g6_openstack-operators(7b153ea5-5794-46c6-a3f3-099b3b45dfef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:15:59 crc kubenswrapper[4841]: I1203 17:15:59.281983 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:59 crc kubenswrapper[4841]: I1203 17:15:59.292650 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b9bdf600-ace4-4f28-80c9-3dd36cf449ad-cert\") pod \"infra-operator-controller-manager-57548d458d-s2b4r\" (UID: \"b9bdf600-ace4-4f28-80c9-3dd36cf449ad\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:59 crc kubenswrapper[4841]: I1203 17:15:59.465758 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:15:59 crc kubenswrapper[4841]: I1203 17:15:59.485308 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:59 crc kubenswrapper[4841]: I1203 17:15:59.488966 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7e01626f-e7f3-4c48-bc9b-5d9261b3d89a-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c\" (UID: \"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:59 crc kubenswrapper[4841]: E1203 17:15:59.510790 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809" Dec 03 17:15:59 crc kubenswrapper[4841]: E1203 17:15:59.511007 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:abdb733b01e92ac17f565762f30f1d075b44c16421bd06e557f6bb3c319e1809,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4kpjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-jmzr6_openstack-operators(c969bc4d-df07-4ec7-b406-7de0710faca8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:15:59 crc kubenswrapper[4841]: I1203 17:15:59.649227 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:15:59 crc kubenswrapper[4841]: I1203 17:15:59.891070 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:59 crc kubenswrapper[4841]: I1203 17:15:59.891179 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:15:59 crc kubenswrapper[4841]: E1203 17:15:59.891321 4841 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 17:15:59 crc kubenswrapper[4841]: E1203 17:15:59.891402 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs podName:3448d609-0836-4562-ac6b-03d353471880 nodeName:}" failed. No retries permitted until 2025-12-03 17:16:15.891385145 +0000 UTC m=+970.278905872 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs") pod "openstack-operator-controller-manager-c7f74d46b-4txld" (UID: "3448d609-0836-4562-ac6b-03d353471880") : secret "webhook-server-cert" not found Dec 03 17:15:59 crc kubenswrapper[4841]: I1203 17:15:59.898188 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-metrics-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:16:00 crc kubenswrapper[4841]: E1203 17:16:00.870552 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5" Dec 03 17:16:00 crc kubenswrapper[4841]: E1203 17:16:00.871023 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:9e847f4dbdea19ab997f32a02b3680a9bd966f9c705911645c3866a19fda9ea5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8htq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-8jcwp_openstack-operators(096189c4-aa40-4a3d-b8df-f8dbfa674e08): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:01 crc kubenswrapper[4841]: E1203 17:16:01.987733 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530" Dec 03 17:16:01 crc kubenswrapper[4841]: E1203 17:16:01.987929 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:0f523b7e2fa9e86fef986acf07d0c42d5658c475d565f11eaea926ebffcb6530,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9qgnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-jfddn_openstack-operators(6384ded0-4512-4d89-bef4-004339bb019d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:02 crc kubenswrapper[4841]: E1203 17:16:02.564316 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea" Dec 03 17:16:02 crc kubenswrapper[4841]: E1203 17:16:02.564830 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f6059a0fbf031d34dcf086d14ce8c0546caeaee23c5780e90b5037c5feee9fea,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kr8lx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-hwp8p_openstack-operators(a6ef72b8-96de-4545-9100-081f42138dff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:03 crc kubenswrapper[4841]: E1203 17:16:03.419670 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9" Dec 03 17:16:03 crc kubenswrapper[4841]: E1203 17:16:03.419940 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:2e59cfbeefc3aff0bb0a6ae9ce2235129f5173c98dd5ee8dac229ad4895faea9,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jsr8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-khncz_openstack-operators(38a95f1c-87ae-4464-b6fa-ad329d17290e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:04 crc kubenswrapper[4841]: E1203 17:16:04.443837 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 03 17:16:04 crc kubenswrapper[4841]: E1203 17:16:04.444211 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-58clf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-z8zb5_openstack-operators(cae5c7a3-2395-4cfe-93f2-5a7301c52444): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:05 crc kubenswrapper[4841]: E1203 17:16:05.054979 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 03 17:16:05 crc kubenswrapper[4841]: E1203 17:16:05.055165 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k5t5j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-fj8w7_openstack-operators(0dda1581-f45b-42cd-840f-9b8f2d7a48b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:06 crc kubenswrapper[4841]: E1203 17:16:06.255104 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 03 17:16:06 crc kubenswrapper[4841]: E1203 17:16:06.255514 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kw6dt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-sl5jm_openstack-operators(a253f1cc-d669-490e-9bf4-aff2e95347b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:06 crc kubenswrapper[4841]: I1203 17:16:06.658529 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r"] Dec 03 17:16:06 crc kubenswrapper[4841]: W1203 17:16:06.850883 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9bdf600_ace4_4f28_80c9_3dd36cf449ad.slice/crio-2d5f3b735c6b39676ab8d95b451333275e11b97e149470ce3b66663efac36af8 WatchSource:0}: Error finding container 2d5f3b735c6b39676ab8d95b451333275e11b97e149470ce3b66663efac36af8: Status 404 returned error can't find the container with id 2d5f3b735c6b39676ab8d95b451333275e11b97e149470ce3b66663efac36af8 Dec 03 17:16:07 crc kubenswrapper[4841]: I1203 17:16:07.302195 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" 
containerID="97e8e6838d5139d4d6b13fd86fb618817aafe29f1ee1b80fa6b5bd7ac24681ae" exitCode=0 Dec 03 17:16:07 crc kubenswrapper[4841]: I1203 17:16:07.302275 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhqc" event={"ID":"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49","Type":"ContainerDied","Data":"97e8e6838d5139d4d6b13fd86fb618817aafe29f1ee1b80fa6b5bd7ac24681ae"} Dec 03 17:16:07 crc kubenswrapper[4841]: I1203 17:16:07.304215 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" event={"ID":"b9bdf600-ace4-4f28-80c9-3dd36cf449ad","Type":"ContainerStarted","Data":"2d5f3b735c6b39676ab8d95b451333275e11b97e149470ce3b66663efac36af8"} Dec 03 17:16:09 crc kubenswrapper[4841]: I1203 17:16:09.316957 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:16:09 crc kubenswrapper[4841]: I1203 17:16:09.317269 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:16:10 crc kubenswrapper[4841]: I1203 17:16:10.361530 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c"] Dec 03 17:16:11 crc kubenswrapper[4841]: W1203 17:16:11.055075 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e01626f_e7f3_4c48_bc9b_5d9261b3d89a.slice/crio-26ed98ca92f21654154d595d53e11bb4d6d2f86cc48e35d46d2826a9adad2611 WatchSource:0}: Error finding container 26ed98ca92f21654154d595d53e11bb4d6d2f86cc48e35d46d2826a9adad2611: Status 404 returned error can't find the container with id 26ed98ca92f21654154d595d53e11bb4d6d2f86cc48e35d46d2826a9adad2611 Dec 03 17:16:11 crc kubenswrapper[4841]: I1203 17:16:11.346461 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" event={"ID":"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a","Type":"ContainerStarted","Data":"26ed98ca92f21654154d595d53e11bb4d6d2f86cc48e35d46d2826a9adad2611"} Dec 03 17:16:12 crc kubenswrapper[4841]: I1203 17:16:12.368802 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" event={"ID":"56b976ca-c419-42f4-b063-c0219f4e0a72","Type":"ContainerStarted","Data":"494e848ac349683fb6acc497130725abbdf181dcb6e60e63f6e88a4099e94ec8"} Dec 03 17:16:12 crc kubenswrapper[4841]: I1203 17:16:12.372167 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" event={"ID":"abd88bfa-5c17-4486-a051-50c1ceaafe60","Type":"ContainerStarted","Data":"41715f43c7c374cc230d671ade6f3224ba3c52883c982d2e182e79cdb7e440b4"} Dec 03 17:16:12 crc kubenswrapper[4841]: I1203 17:16:12.378388 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" event={"ID":"36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb","Type":"ContainerStarted","Data":"c6ceea0d7c5c637df2245a4149e517c9e19b8ff99e60fb52725a34f4b2a06d41"} Dec 03 17:16:12 crc kubenswrapper[4841]: I1203 17:16:12.381539 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" event={"ID":"70e46d25-a5c6-49b4-b3d5-0828bc234644","Type":"ContainerStarted","Data":"e29d5e067dd7be211f88f425be625af1566b1e119ec3f5949e6e96e817c44939"} Dec 03 17:16:12 crc kubenswrapper[4841]: I1203 17:16:12.390639 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" event={"ID":"6dbdda39-de04-49e2-8667-58eb77b076b9","Type":"ContainerStarted","Data":"591c4225d2a1f493b80322ffaa2a20b1b776c67cf25fab437a18f854f53dbbf4"} Dec 03 17:16:12 crc kubenswrapper[4841]: I1203 17:16:12.393197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" event={"ID":"b24334e0-1dd6-4667-8ce1-6013cc71dd7f","Type":"ContainerStarted","Data":"c15e9a2f8e4e75753d5fcc15d775adfb757bc930bb9e6fcb5d3247d28696d829"} Dec 03 17:16:13 crc kubenswrapper[4841]: I1203 17:16:13.401491 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" event={"ID":"d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8","Type":"ContainerStarted","Data":"03d8fea77aa7f30155c189d1ef1980d9b256025e8c36234105655ae003899931"} Dec 03 17:16:13 crc kubenswrapper[4841]: I1203 17:16:13.403474 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" event={"ID":"fe8b56bd-b492-48cf-a3f2-621b4f58d29c","Type":"ContainerStarted","Data":"4928cbb16898739c9a5f02aec9662223e1c0488aec16bf1e94b826203dcd6f69"} Dec 03 17:16:13 crc kubenswrapper[4841]: I1203 17:16:13.406184 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhqc" event={"ID":"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49","Type":"ContainerStarted","Data":"e0f3b7d98e35f369a465586fd1765b818c279a275738035fe282ac85c46ebcee"} Dec 03 17:16:13 crc kubenswrapper[4841]: I1203 
17:16:13.427431 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8vhqc" podStartSLOduration=4.250601807 podStartE2EDuration="30.427407096s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:46.050567826 +0000 UTC m=+940.438088553" lastFinishedPulling="2025-12-03 17:16:12.227373115 +0000 UTC m=+966.614893842" observedRunningTime="2025-12-03 17:16:13.421898067 +0000 UTC m=+967.809418794" watchObservedRunningTime="2025-12-03 17:16:13.427407096 +0000 UTC m=+967.814927823" Dec 03 17:16:14 crc kubenswrapper[4841]: I1203 17:16:14.419224 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:16:14 crc kubenswrapper[4841]: I1203 17:16:14.419578 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:16:15 crc kubenswrapper[4841]: I1203 17:16:15.461556 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-8vhqc" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerName="registry-server" probeResult="failure" output=< Dec 03 17:16:15 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Dec 03 17:16:15 crc kubenswrapper[4841]: > Dec 03 17:16:15 crc kubenswrapper[4841]: I1203 17:16:15.965768 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:16:15 crc kubenswrapper[4841]: I1203 17:16:15.976944 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/3448d609-0836-4562-ac6b-03d353471880-webhook-certs\") pod \"openstack-operator-controller-manager-c7f74d46b-4txld\" (UID: \"3448d609-0836-4562-ac6b-03d353471880\") " pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:16:16 crc kubenswrapper[4841]: I1203 17:16:16.186065 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-mvmsn" Dec 03 17:16:16 crc kubenswrapper[4841]: I1203 17:16:16.195099 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.549868 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z9pgf"] Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.552975 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.558302 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9pgf"] Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.584186 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgkt2\" (UniqueName: \"kubernetes.io/projected/119786cb-9b22-466d-a3a2-ac9fd8465731-kube-api-access-lgkt2\") pod \"redhat-marketplace-z9pgf\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.584313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-utilities\") pod \"redhat-marketplace-z9pgf\" (UID: 
\"119786cb-9b22-466d-a3a2-ac9fd8465731\") " pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.584360 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-catalog-content\") pod \"redhat-marketplace-z9pgf\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.685820 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-utilities\") pod \"redhat-marketplace-z9pgf\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.685868 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-catalog-content\") pod \"redhat-marketplace-z9pgf\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.685949 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgkt2\" (UniqueName: \"kubernetes.io/projected/119786cb-9b22-466d-a3a2-ac9fd8465731-kube-api-access-lgkt2\") pod \"redhat-marketplace-z9pgf\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.686670 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-utilities\") pod \"redhat-marketplace-z9pgf\" (UID: 
\"119786cb-9b22-466d-a3a2-ac9fd8465731\") " pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.686890 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-catalog-content\") pod \"redhat-marketplace-z9pgf\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.708119 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgkt2\" (UniqueName: \"kubernetes.io/projected/119786cb-9b22-466d-a3a2-ac9fd8465731-kube-api-access-lgkt2\") pod \"redhat-marketplace-z9pgf\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:22 crc kubenswrapper[4841]: I1203 17:16:22.893100 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:23 crc kubenswrapper[4841]: E1203 17:16:23.022436 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:23 crc kubenswrapper[4841]: E1203 17:16:23.022623 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hgjbv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-fnjsz_openstack-operators(56b976ca-c419-42f4-b063-c0219f4e0a72): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:23 crc kubenswrapper[4841]: E1203 17:16:23.022684 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:23 crc kubenswrapper[4841]: E1203 17:16:23.022862 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c5kxh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-qgzs7_openstack-operators(6dbdda39-de04-49e2-8667-58eb77b076b9): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 17:16:23 crc kubenswrapper[4841]: E1203 17:16:23.023997 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" podUID="56b976ca-c419-42f4-b063-c0219f4e0a72" Dec 03 17:16:23 crc kubenswrapper[4841]: E1203 17:16:23.024111 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" podUID="6dbdda39-de04-49e2-8667-58eb77b076b9" Dec 03 17:16:23 crc kubenswrapper[4841]: I1203 17:16:23.468220 4841 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" Dec 03 17:16:23 crc kubenswrapper[4841]: E1203 17:16:23.469172 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" podUID="56b976ca-c419-42f4-b063-c0219f4e0a72" Dec 03 17:16:23 crc kubenswrapper[4841]: E1203 17:16:23.469647 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" podUID="6dbdda39-de04-49e2-8667-58eb77b076b9" Dec 03 17:16:23 crc kubenswrapper[4841]: I1203 17:16:23.470124 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" Dec 03 17:16:24 crc kubenswrapper[4841]: I1203 17:16:24.456710 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:16:24 crc kubenswrapper[4841]: E1203 17:16:24.487551 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" podUID="56b976ca-c419-42f4-b063-c0219f4e0a72" Dec 03 17:16:24 crc kubenswrapper[4841]: I1203 17:16:24.502757 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:16:24 crc 
kubenswrapper[4841]: I1203 17:16:24.925959 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vhqc"] Dec 03 17:16:25 crc kubenswrapper[4841]: I1203 17:16:25.492343 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" event={"ID":"1348de54-9137-400b-b3db-b684d9a03dc4","Type":"ContainerStarted","Data":"95dc1f80336ad115a9fee83f55fe895bad53e608934936846934c4db6fcc9c4d"} Dec 03 17:16:25 crc kubenswrapper[4841]: I1203 17:16:25.494150 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" event={"ID":"8a13d35a-d714-4a7f-922b-a6d3a0b580c3","Type":"ContainerStarted","Data":"9c1f456427fa95a26428db1abf7bf39a195036191d1b5d286ead7067d9807a0b"} Dec 03 17:16:25 crc kubenswrapper[4841]: I1203 17:16:25.495857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" event={"ID":"a4eccc19-eb01-4b44-99e0-041144e4b409","Type":"ContainerStarted","Data":"9fa4a28fa5653ff0689299936a19f4108336d2c77dbfbb5abf8e6ec4baa6c61d"} Dec 03 17:16:25 crc kubenswrapper[4841]: I1203 17:16:25.496177 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8vhqc" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerName="registry-server" containerID="cri-o://e0f3b7d98e35f369a465586fd1765b818c279a275738035fe282ac85c46ebcee" gracePeriod=2 Dec 03 17:16:25 crc kubenswrapper[4841]: E1203 17:16:25.497585 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" podUID="56b976ca-c419-42f4-b063-c0219f4e0a72" Dec 03 
17:16:27 crc kubenswrapper[4841]: I1203 17:16:27.511389 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerID="e0f3b7d98e35f369a465586fd1765b818c279a275738035fe282ac85c46ebcee" exitCode=0 Dec 03 17:16:27 crc kubenswrapper[4841]: I1203 17:16:27.511479 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhqc" event={"ID":"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49","Type":"ContainerDied","Data":"e0f3b7d98e35f369a465586fd1765b818c279a275738035fe282ac85c46ebcee"} Dec 03 17:16:27 crc kubenswrapper[4841]: I1203 17:16:27.532194 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tv5cl" podStartSLOduration=18.973327129 podStartE2EDuration="44.532169999s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.519413282 +0000 UTC m=+939.906934009" lastFinishedPulling="2025-12-03 17:16:11.078256152 +0000 UTC m=+965.465776879" observedRunningTime="2025-12-03 17:16:27.531059423 +0000 UTC m=+981.918580160" watchObservedRunningTime="2025-12-03 17:16:27.532169999 +0000 UTC m=+981.919690766" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.017114 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.017278 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8htq5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-68c6d99b8f-8jcwp_openstack-operators(096189c4-aa40-4a3d-b8df-f8dbfa674e08): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.018626 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" podUID="096189c4-aa40-4a3d-b8df-f8dbfa674e08" Dec 03 17:16:28 crc 
kubenswrapper[4841]: E1203 17:16:28.029616 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.029871 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jsr8h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-7c79b5df47-khncz_openstack-operators(38a95f1c-87ae-4464-b6fa-ad329d17290e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" 
logger="UnhandledError" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.033009 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" podUID="38a95f1c-87ae-4464-b6fa-ad329d17290e" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.034982 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.035234 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4kpjw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-77987cd8cd-jmzr6_openstack-operators(c969bc4d-df07-4ec7-b406-7de0710faca8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.036676 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" podUID="c969bc4d-df07-4ec7-b406-7de0710faca8" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.037435 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.037589 4841 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dk7f4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-b6456fdb6-2d7g6_openstack-operators(7b153ea5-5794-46c6-a3f3-099b3b45dfef): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.037666 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.037732 4841 kuberuntime_manager.go:1274] 
"Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9qgnp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6c548fd776-jfddn_openstack-operators(6384ded0-4512-4d89-bef4-004339bb019d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.038886 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with 
ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" podUID="6384ded0-4512-4d89-bef4-004339bb019d" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.038965 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" podUID="7b153ea5-5794-46c6-a3f3-099b3b45dfef" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.039768 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.039861 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kr8lx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7d9dfd778-hwp8p_openstack-operators(a6ef72b8-96de-4545-9100-081f42138dff): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.041016 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" podUID="a6ef72b8-96de-4545-9100-081f42138dff" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.042410 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.042510 4841 kuberuntime_manager.go:1274] "Unhandled 
Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kw6dt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-sl5jm_openstack-operators(a253f1cc-d669-490e-9bf4-aff2e95347b0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:28 crc kubenswrapper[4841]: E1203 17:16:28.044396 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc 
error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" podUID="a253f1cc-d669-490e-9bf4-aff2e95347b0" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.175082 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.175661 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wsl4d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
octavia-operator-controller-manager-998648c74-l56t4_openstack-operators(b24334e0-1dd6-4667-8ce1-6013cc71dd7f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.176964 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" podUID="b24334e0-1dd6-4667-8ce1-6013cc71dd7f" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.211770 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.212008 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7t4v8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-5f8c65bbfc-km89l_openstack-operators(70e46d25-a5c6-49b4-b3d5-0828bc234644): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.211779 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.212332 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m 
DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rlzj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-56bbcc9d85-t9pmr_openstack-operators(abd88bfa-5c17-4486-a051-50c1ceaafe60): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.213589 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" podUID="70e46d25-a5c6-49b4-b3d5-0828bc234644" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.218072 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" podUID="abd88bfa-5c17-4486-a051-50c1ceaafe60" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.234671 4841 log.go:32] 
"PullImage from image service failed" err="rpc error: code = Canceled desc = copying layer: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.234897 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-r6vg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-627sl_openstack-operators(36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb): ErrImagePull: rpc error: code = Canceled desc = copying layer: context canceled" logger="UnhandledError" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 
17:16:29.236164 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying layer: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" podUID="36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.530422 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" event={"ID":"b9bdf600-ace4-4f28-80c9-3dd36cf449ad","Type":"ContainerStarted","Data":"d46db8b079876f531e05fdb96a9bb2d08d8f6bd2071fe9f79e366b8f686d8b2b"} Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.531354 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.531376 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.531391 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.531403 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.533883 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.534069 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" Dec 03 17:16:29 crc 
kubenswrapper[4841]: I1203 17:16:29.534248 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.534301 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.544119 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.544307 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-58clf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-z8zb5_openstack-operators(cae5c7a3-2395-4cfe-93f2-5a7301c52444): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.545706 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"]" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" podUID="cae5c7a3-2395-4cfe-93f2-5a7301c52444" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.564347 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9pgf"] Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.572833 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81" Dec 03 17:16:29 crc kubenswrapper[4841]: E1203 17:16:29.573322 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-baremetal-operator-agent:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_ANSIBLEEE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-ansibleee-runner:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_EVALUATOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-evaluator:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_AODH_NOTIFIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-aodh-notifier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_APACHE_IMAGE_URL_DEFAULT,Value:registry.redhat.io/ubi9/httpd-24:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARB
ICAN_KEYSTONE_LISTENER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-keystone-listener:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BARBICAN_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-barbican-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_IPMI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-ipmi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_MYSQLD_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/mysqld-exporter:v0.15.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_NOTIFICATION_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ceilometer-notification:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CEILOMETER_SGCORE_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_BACKUP_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-backup:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CINDER_VOLUME_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cinder-volume:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_API_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-api:curr
ent,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CLOUDKITTY_PROC_IMAGE_URL_DEFAULT,Value:quay.rdoproject.org/podified-master-centos10/openstack-cloudkitty-processor:current,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_BACKENDBIND9_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-backend-bind9:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_CENTRAL_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-central:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_MDNS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-mdns:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_PRODUCER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-producer:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_UNBOUND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-unbound:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_DESIGNATE_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-designate-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_FRR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-frr:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_ISCSID_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-iscsid:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_KEPLER_IMAGE_URL_DEFAULT,Value:quay.io/sustainable_computing_io/kepler:release-0.7.12,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_LOGROTATE_CROND_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-cron:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_MULTIPATHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mult
ipathd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_DHCP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_METADATA_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_OVN_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-ovn-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NEUTRON_SRIOV_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_NODE_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/prometheus/node-exporter:v1.5.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_OVN_BGP_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-bgp-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_EDPM_PODMAN_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/navidys/prometheus-podman-exporter:v1.10.1,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_GLANCE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-glance-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_CFNAPI_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-api-cfn:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HEAT_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_HORIZON_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_MEMCACHED_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opens
tack-memcached:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_INFRA_REDIS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-redis:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_INSPECTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-inspector:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_NEUTRON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-neutron-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PXE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ironic-pxe:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_IRONIC_PYTHON_AGENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/ironic-python-agent:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KEYSTONE_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-keystone:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_KSM_IMAGE_URL_DEFAULT,Value:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MANILA_SHARE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-manila-share:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_MARIADB_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-mariadb:current
-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NET_UTILS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-netutils:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NEUTRON_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_COMPUTE_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_CONDUCTOR_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_NOVNC_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-novncproxy:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_NOVA_SCHEDULER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-nova-scheduler:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HEALTHMANAGER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-health-manager:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_HOUSEKEEPING_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-housekeeping:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_RSYSLOG_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OCTAVIA_WORKER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-octavia-worker:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_CLIENT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/opensta
ck-openstackclient:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_MUST_GATHER_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-must-gather:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPENSTACK_NETWORK_EXPORTER_IMAGE_URL_DEFAULT,Value:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OS_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/edpm-hardened-uefi:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_CONTROLLER_OVS_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-base:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-nb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_NORTHD_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-northd:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OVN_SB_DBCLUSTER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-ovn-sb-db-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PLACEMENT_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-placement-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_RABBITMQ_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_ACCOUNT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-account:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_CONTAINER_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-container:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_OBJECT_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-cento
s9/openstack-swift-object:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_SWIFT_PROXY_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-swift-proxy-server:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_TEST_TEMPEST_IMAGE_URL_DEFAULT,Value:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_API_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-api:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_APPLIER_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-applier:current-podified,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_WATCHER_DECISION_ENGINE_IMAGE_URL_DEFAULT,Value:quay.io/podified-master-centos9/openstack-watcher-decision-engine:current-podified,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cert,ReadOnly:true,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k4dl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c_openstack-operators(7e01626f-e7f3-4c48-bc9b-5d9261b3d89a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:16:29 crc kubenswrapper[4841]: W1203 17:16:29.635891 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod119786cb_9b22_466d_a3a2_ac9fd8465731.slice/crio-7ac0e3e6714135fbc22b7127ccc17a77f65a97f4dc9a12043725e51d5d2c799e WatchSource:0}: Error finding container 7ac0e3e6714135fbc22b7127ccc17a77f65a97f4dc9a12043725e51d5d2c799e: Status 404 returned error can't find the container with id 7ac0e3e6714135fbc22b7127ccc17a77f65a97f4dc9a12043725e51d5d2c799e Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.732511 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld"] Dec 03 17:16:29 crc kubenswrapper[4841]: W1203 17:16:29.795770 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3448d609_0836_4562_ac6b_03d353471880.slice/crio-cdae3933da0fe73ae124078494936f83723b00a6725533135b5d44fd8aac7e33 WatchSource:0}: Error finding container cdae3933da0fe73ae124078494936f83723b00a6725533135b5d44fd8aac7e33: Status 404 returned error can't find the container with id cdae3933da0fe73ae124078494936f83723b00a6725533135b5d44fd8aac7e33 Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.815336 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.882998 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-utilities\") pod \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.884053 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-utilities" (OuterVolumeSpecName: "utilities") pod "a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" (UID: "a9cc4d57-7fab-4a4e-9a5d-b334918f4c49"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.884211 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c87w\" (UniqueName: \"kubernetes.io/projected/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-kube-api-access-6c87w\") pod \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.884975 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-catalog-content\") pod \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\" (UID: \"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49\") " Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.885241 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.889220 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-kube-api-access-6c87w" (OuterVolumeSpecName: "kube-api-access-6c87w") pod "a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" (UID: "a9cc4d57-7fab-4a4e-9a5d-b334918f4c49"). InnerVolumeSpecName "kube-api-access-6c87w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.939012 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" (UID: "a9cc4d57-7fab-4a4e-9a5d-b334918f4c49"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.986339 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c87w\" (UniqueName: \"kubernetes.io/projected/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-kube-api-access-6c87w\") on node \"crc\" DevicePath \"\"" Dec 03 17:16:29 crc kubenswrapper[4841]: I1203 17:16:29.986363 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:16:30 crc kubenswrapper[4841]: E1203 17:16:30.140103 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" podUID="7e01626f-e7f3-4c48-bc9b-5d9261b3d89a" Dec 03 17:16:30 crc kubenswrapper[4841]: E1203 17:16:30.218669 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" podUID="0dda1581-f45b-42cd-840f-9b8f2d7a48b1" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.535460 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" event={"ID":"70e46d25-a5c6-49b4-b3d5-0828bc234644","Type":"ContainerStarted","Data":"727132c83c356093bb110ba049cf91d8f9d2aca3da4caa4a5fd8b8848f240a66"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.537592 4841 generic.go:334] "Generic (PLEG): container finished" podID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerID="e8967e0176fa2bd56ee254141ebfccb2a6d04007592126cf3184944df247d36e" exitCode=0 Dec 03 
17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.537667 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9pgf" event={"ID":"119786cb-9b22-466d-a3a2-ac9fd8465731","Type":"ContainerDied","Data":"e8967e0176fa2bd56ee254141ebfccb2a6d04007592126cf3184944df247d36e"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.537711 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9pgf" event={"ID":"119786cb-9b22-466d-a3a2-ac9fd8465731","Type":"ContainerStarted","Data":"7ac0e3e6714135fbc22b7127ccc17a77f65a97f4dc9a12043725e51d5d2c799e"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.539431 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" event={"ID":"fe8b56bd-b492-48cf-a3f2-621b4f58d29c","Type":"ContainerStarted","Data":"1d382d5bfec0ba81ad9a8faaf5243635a34ad7fb06f4af42002d109e014a247e"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.539646 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.540956 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" event={"ID":"3448d609-0836-4562-ac6b-03d353471880","Type":"ContainerStarted","Data":"d3439b4398fcdddec317e20dc28f4a11da76aa6cc5ab459b7c17570ce12cc1db"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.540980 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" event={"ID":"3448d609-0836-4562-ac6b-03d353471880","Type":"ContainerStarted","Data":"cdae3933da0fe73ae124078494936f83723b00a6725533135b5d44fd8aac7e33"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.541096 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.541806 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.545461 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" event={"ID":"abd88bfa-5c17-4486-a051-50c1ceaafe60","Type":"ContainerStarted","Data":"84cfd68960c51fd5a3c54e5a246002ef074e9f5150c7d1ab740da5cb7a45c2e5"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.547544 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" event={"ID":"36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb","Type":"ContainerStarted","Data":"0e76c2c5b8168a199437771317bef4064037e5702cb31d1ac9aae68d815c00e0"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.549170 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" event={"ID":"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a","Type":"ContainerStarted","Data":"d45ad07402ed3707ff757664f5dd537acebaa3859a354fc6aaf239ffcc19cfb1"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.555880 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" event={"ID":"0dda1581-f45b-42cd-840f-9b8f2d7a48b1","Type":"ContainerStarted","Data":"2e2b3179c5e177bb50aadec58e91e01f13fe5c3c0786983edc257bee0bee1f37"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.556390 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-km89l" podStartSLOduration=27.533142831 
podStartE2EDuration="47.556369598s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.02042669 +0000 UTC m=+939.407947417" lastFinishedPulling="2025-12-03 17:16:05.043653457 +0000 UTC m=+959.431174184" observedRunningTime="2025-12-03 17:16:30.554392592 +0000 UTC m=+984.941913319" watchObservedRunningTime="2025-12-03 17:16:30.556369598 +0000 UTC m=+984.943890325" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.560754 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" event={"ID":"b24334e0-1dd6-4667-8ce1-6013cc71dd7f","Type":"ContainerStarted","Data":"5c2053b41bdb051e878aa760d85f7787b61da708233ad43428ffe1f7bceba324"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.567076 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8vhqc" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.567541 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8vhqc" event={"ID":"a9cc4d57-7fab-4a4e-9a5d-b334918f4c49","Type":"ContainerDied","Data":"fed136fb6f121cf45ff305f41935596ab8b3a8a49e4f032c3429c8e86f161ee3"} Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.567618 4841 scope.go:117] "RemoveContainer" containerID="e0f3b7d98e35f369a465586fd1765b818c279a275738035fe282ac85c46ebcee" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.623553 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" podStartSLOduration=27.251390643 podStartE2EDuration="47.623530587s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:44.671802449 +0000 UTC m=+939.059323176" lastFinishedPulling="2025-12-03 17:16:05.043942393 +0000 UTC m=+959.431463120" observedRunningTime="2025-12-03 
17:16:30.617858764 +0000 UTC m=+985.005379491" watchObservedRunningTime="2025-12-03 17:16:30.623530587 +0000 UTC m=+985.011051314" Dec 03 17:16:30 crc kubenswrapper[4841]: E1203 17:16:30.626070 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81\\\"\"" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" podUID="7e01626f-e7f3-4c48-bc9b-5d9261b3d89a" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.671356 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" podStartSLOduration=47.671331613 podStartE2EDuration="47.671331613s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:16:30.658251658 +0000 UTC m=+985.045772385" watchObservedRunningTime="2025-12-03 17:16:30.671331613 +0000 UTC m=+985.058852330" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.689447 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-r9rbr" podStartSLOduration=3.51796264 podStartE2EDuration="47.689427656s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.398717934 +0000 UTC m=+939.786238661" lastFinishedPulling="2025-12-03 17:16:29.57018295 +0000 UTC m=+983.957703677" observedRunningTime="2025-12-03 17:16:30.679742879 +0000 UTC m=+985.067263606" watchObservedRunningTime="2025-12-03 17:16:30.689427656 +0000 UTC m=+985.076948373" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.707102 4841 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-627sl" podStartSLOduration=28.027167819 podStartE2EDuration="47.707080158s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.364488925 +0000 UTC m=+939.752009652" lastFinishedPulling="2025-12-03 17:16:05.044401264 +0000 UTC m=+959.431921991" observedRunningTime="2025-12-03 17:16:30.701252272 +0000 UTC m=+985.088773009" watchObservedRunningTime="2025-12-03 17:16:30.707080158 +0000 UTC m=+985.094600885" Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.782035 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8vhqc"] Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.802570 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8vhqc"] Dec 03 17:16:30 crc kubenswrapper[4841]: I1203 17:16:30.802983 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-l56t4" podStartSLOduration=26.572690845 podStartE2EDuration="47.802959027s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.013187431 +0000 UTC m=+939.400708158" lastFinishedPulling="2025-12-03 17:16:06.243455603 +0000 UTC m=+960.630976340" observedRunningTime="2025-12-03 17:16:30.794203472 +0000 UTC m=+985.181724199" watchObservedRunningTime="2025-12-03 17:16:30.802959027 +0000 UTC m=+985.190479754" Dec 03 17:16:32 crc kubenswrapper[4841]: E1203 17:16:32.037124 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-baremetal-operator@sha256:14cfad6ea2e7f7ecc4cb2aafceb9c61514b3d04b66668832d1e4ac3b19f1ab81\\\"\"" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" podUID="7e01626f-e7f3-4c48-bc9b-5d9261b3d89a" Dec 03 17:16:32 crc kubenswrapper[4841]: I1203 17:16:32.260992 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" path="/var/lib/kubelet/pods/a9cc4d57-7fab-4a4e-9a5d-b334918f4c49/volumes" Dec 03 17:16:33 crc kubenswrapper[4841]: I1203 17:16:33.022898 4841 scope.go:117] "RemoveContainer" containerID="97e8e6838d5139d4d6b13fd86fb618817aafe29f1ee1b80fa6b5bd7ac24681ae" Dec 03 17:16:33 crc kubenswrapper[4841]: I1203 17:16:33.292312 4841 scope.go:117] "RemoveContainer" containerID="7dbdbd4e2177da28b59066fa9f42763601f33d01adf99fbfa9b64037c7376ac9" Dec 03 17:16:33 crc kubenswrapper[4841]: I1203 17:16:33.421834 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" Dec 03 17:16:33 crc kubenswrapper[4841]: I1203 17:16:33.424767 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.205378 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-c7f74d46b-4txld" Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.633343 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" event={"ID":"b9bdf600-ace4-4f28-80c9-3dd36cf449ad","Type":"ContainerStarted","Data":"0e1af3aa8e46f480741d4dcfcc2179786b2296cafb897897f283c08f494a901b"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.637946 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" 
event={"ID":"7b153ea5-5794-46c6-a3f3-099b3b45dfef","Type":"ContainerStarted","Data":"c670396648cbba295723162d3b875e171a0ed97710d30b66c5400fd1370055f3"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.642772 4841 generic.go:334] "Generic (PLEG): container finished" podID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerID="7b18cf1f75755542f3eb10b87e847fb1c52ab6e838b7a6df37a883ef92e2048a" exitCode=0 Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.642836 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9pgf" event={"ID":"119786cb-9b22-466d-a3a2-ac9fd8465731","Type":"ContainerDied","Data":"7b18cf1f75755542f3eb10b87e847fb1c52ab6e838b7a6df37a883ef92e2048a"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.644704 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.647918 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" event={"ID":"a4eccc19-eb01-4b44-99e0-041144e4b409","Type":"ContainerStarted","Data":"fa402bbaa35a308932230eee05305cc8da39f97be4e6518d4bdae5f0e2a2932d"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.650930 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" event={"ID":"c969bc4d-df07-4ec7-b406-7de0710faca8","Type":"ContainerStarted","Data":"0a1a8d00cf82741fc68996f6d075d3e19c339aeeb3e8c1086164f7a62ee183cb"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.652406 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" event={"ID":"096189c4-aa40-4a3d-b8df-f8dbfa674e08","Type":"ContainerStarted","Data":"4aa0c579083374262f5663c981035cddca15981b55a9a00dc90986d34c99d96f"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 
17:16:36.653416 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" event={"ID":"38a95f1c-87ae-4464-b6fa-ad329d17290e","Type":"ContainerStarted","Data":"73912474d76c16a27338c8a476098831922dcacea4b84d9607135f3be7593988"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.654500 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" event={"ID":"a6ef72b8-96de-4545-9100-081f42138dff","Type":"ContainerStarted","Data":"cdd23901ffbd9984d483e7fb45cbfeb4589375b58b8a013164594272c126a414"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.655508 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" event={"ID":"6384ded0-4512-4d89-bef4-004339bb019d","Type":"ContainerStarted","Data":"428724e5a7701aef2418519e56445719feb061f62fb9864eee9a48192e4b1310"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.671972 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" event={"ID":"0dda1581-f45b-42cd-840f-9b8f2d7a48b1","Type":"ContainerStarted","Data":"7110a42563537d713e33fb7aa2a70493d3f252b2929fb36ef414d1a43ff06595"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.672548 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.674344 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" event={"ID":"6dbdda39-de04-49e2-8667-58eb77b076b9","Type":"ContainerStarted","Data":"a86364b39113b96ca15bdffc73428be6c2a1d96476b5f0dbae25940f866fa44c"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.676005 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" event={"ID":"d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8","Type":"ContainerStarted","Data":"a9be4671d6e8b39488ddb92ea02fbb2adcaa98585ccddb97e9c604e3aad3f532"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.676361 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.677208 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" event={"ID":"a253f1cc-d669-490e-9bf4-aff2e95347b0","Type":"ContainerStarted","Data":"88274f351712d77dcd3ce0852062b8e4035bfadb026c0a294ae9b4188bcc629f"} Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.677690 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.706296 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" podStartSLOduration=4.105927179 podStartE2EDuration="53.706279126s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:44.823150634 +0000 UTC m=+939.210671371" lastFinishedPulling="2025-12-03 17:16:34.423502561 +0000 UTC m=+988.811023318" observedRunningTime="2025-12-03 17:16:36.699559469 +0000 UTC m=+991.087080196" watchObservedRunningTime="2025-12-03 17:16:36.706279126 +0000 UTC m=+991.093799853" Dec 03 17:16:36 crc kubenswrapper[4841]: I1203 17:16:36.731696 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-bfvv8" podStartSLOduration=5.124360821 podStartE2EDuration="53.731669139s" 
podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:44.685691194 +0000 UTC m=+939.073211921" lastFinishedPulling="2025-12-03 17:16:33.292999512 +0000 UTC m=+987.680520239" observedRunningTime="2025-12-03 17:16:36.727921981 +0000 UTC m=+991.115442708" watchObservedRunningTime="2025-12-03 17:16:36.731669139 +0000 UTC m=+991.119189866" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.707326 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" event={"ID":"096189c4-aa40-4a3d-b8df-f8dbfa674e08","Type":"ContainerStarted","Data":"b03895e06ede91b6bce1ce5ebdf41dbeeced95e620ec518c789e26e98e37806d"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.707525 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.709574 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" event={"ID":"a6ef72b8-96de-4545-9100-081f42138dff","Type":"ContainerStarted","Data":"b5d509f42cf9f41584b436094f2a198189c207ee7ba607b3ae1838fe1bb22a29"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.709767 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.712368 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" event={"ID":"7b153ea5-5794-46c6-a3f3-099b3b45dfef","Type":"ContainerStarted","Data":"f2a55533a0573dd1e751a40c22e77cf0eec68a16a228b632bc0ec5fef05a4b07"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.712511 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.714603 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" event={"ID":"cae5c7a3-2395-4cfe-93f2-5a7301c52444","Type":"ContainerStarted","Data":"1b8d1a33b9ddd1923543478c04abb94aaf12bb1f488cd692660c87e950243f77"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.714637 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" event={"ID":"cae5c7a3-2395-4cfe-93f2-5a7301c52444","Type":"ContainerStarted","Data":"717c5d53c2827b7f8a6879d85a7a27c7263ec2b0cf05bf0ba7d6a3ccdd4f6da5"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.714810 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.716314 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" event={"ID":"6384ded0-4512-4d89-bef4-004339bb019d","Type":"ContainerStarted","Data":"366ab7767b744a2c2026cb6f4270d6551b0e8aadb185ea44a88b478940d8bbbb"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.716485 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.718247 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" event={"ID":"a253f1cc-d669-490e-9bf4-aff2e95347b0","Type":"ContainerStarted","Data":"1f4f184bc9d5d90f31ad38b8a40770b64a2049a04adc5474b72f1158f72de152"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.718389 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.721559 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9pgf" event={"ID":"119786cb-9b22-466d-a3a2-ac9fd8465731","Type":"ContainerStarted","Data":"12a3e01f1657a89b757b1cc247f8b7f59416d3e9690532b82f78448f0801f8e6"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.727156 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" podStartSLOduration=6.140915388 podStartE2EDuration="54.727130954s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:44.706058149 +0000 UTC m=+939.093578876" lastFinishedPulling="2025-12-03 17:16:33.292273715 +0000 UTC m=+987.679794442" observedRunningTime="2025-12-03 17:16:37.723589932 +0000 UTC m=+992.111110659" watchObservedRunningTime="2025-12-03 17:16:37.727130954 +0000 UTC m=+992.114651681" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.727746 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" event={"ID":"38a95f1c-87ae-4464-b6fa-ad329d17290e","Type":"ContainerStarted","Data":"1ada8fb96d28401d921c6d6fe9977ce84135fa69022bc893027b643f80d293be"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.727892 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.730935 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" event={"ID":"c969bc4d-df07-4ec7-b406-7de0710faca8","Type":"ContainerStarted","Data":"0e270b8915f7a6082c6815d3ec876fce00c9dc1afa03bc41471f92678019420e"} Dec 03 17:16:37 crc 
kubenswrapper[4841]: I1203 17:16:37.731652 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.737289 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" event={"ID":"1348de54-9137-400b-b3db-b684d9a03dc4","Type":"ContainerStarted","Data":"016469d14fa3ac3023834354a86724aea49f7795607cd7de1342bcef47ab8d5b"} Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.737380 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.740936 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.753303 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" podStartSLOduration=6.253790665 podStartE2EDuration="54.753283425s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.055746025 +0000 UTC m=+939.443266752" lastFinishedPulling="2025-12-03 17:16:33.555238785 +0000 UTC m=+987.942759512" observedRunningTime="2025-12-03 17:16:37.747209213 +0000 UTC m=+992.134729940" watchObservedRunningTime="2025-12-03 17:16:37.753283425 +0000 UTC m=+992.140804152" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.777964 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" podStartSLOduration=5.507456188 podStartE2EDuration="54.777949531s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 
17:15:44.218323181 +0000 UTC m=+938.605843908" lastFinishedPulling="2025-12-03 17:16:33.488816524 +0000 UTC m=+987.876337251" observedRunningTime="2025-12-03 17:16:37.773420875 +0000 UTC m=+992.160941602" watchObservedRunningTime="2025-12-03 17:16:37.777949531 +0000 UTC m=+992.165470258" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.807922 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" podStartSLOduration=4.394827246 podStartE2EDuration="54.80789512s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.19129381 +0000 UTC m=+939.578814537" lastFinishedPulling="2025-12-03 17:16:35.604361684 +0000 UTC m=+989.991882411" observedRunningTime="2025-12-03 17:16:37.806422906 +0000 UTC m=+992.193943633" watchObservedRunningTime="2025-12-03 17:16:37.80789512 +0000 UTC m=+992.195415847" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.830262 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z9pgf" podStartSLOduration=9.168444511 podStartE2EDuration="15.830248762s" podCreationTimestamp="2025-12-03 17:16:22 +0000 UTC" firstStartedPulling="2025-12-03 17:16:30.655029782 +0000 UTC m=+985.042550509" lastFinishedPulling="2025-12-03 17:16:37.316834033 +0000 UTC m=+991.704354760" observedRunningTime="2025-12-03 17:16:37.824702473 +0000 UTC m=+992.212223200" watchObservedRunningTime="2025-12-03 17:16:37.830248762 +0000 UTC m=+992.217769489" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.855077 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" podStartSLOduration=3.6751350609999998 podStartE2EDuration="54.855061542s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:44.66753716 +0000 UTC m=+939.055057887" 
lastFinishedPulling="2025-12-03 17:16:35.847463651 +0000 UTC m=+990.234984368" observedRunningTime="2025-12-03 17:16:37.850445574 +0000 UTC m=+992.237966311" watchObservedRunningTime="2025-12-03 17:16:37.855061542 +0000 UTC m=+992.242582269" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.868503 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" podStartSLOduration=6.389588516 podStartE2EDuration="54.868486685s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:44.813401836 +0000 UTC m=+939.200922563" lastFinishedPulling="2025-12-03 17:16:33.292299995 +0000 UTC m=+987.679820732" observedRunningTime="2025-12-03 17:16:37.866611961 +0000 UTC m=+992.254132688" watchObservedRunningTime="2025-12-03 17:16:37.868486685 +0000 UTC m=+992.256007412" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.888418 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" podStartSLOduration=6.741886172 podStartE2EDuration="54.8884001s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:44.776025123 +0000 UTC m=+939.163545850" lastFinishedPulling="2025-12-03 17:16:32.922539021 +0000 UTC m=+987.310059778" observedRunningTime="2025-12-03 17:16:37.887342595 +0000 UTC m=+992.274863322" watchObservedRunningTime="2025-12-03 17:16:37.8884001 +0000 UTC m=+992.275920827" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.902790 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-q9b79" podStartSLOduration=6.796914098 podStartE2EDuration="54.902777616s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.382651569 +0000 UTC m=+939.770172296" lastFinishedPulling="2025-12-03 
17:16:33.488515087 +0000 UTC m=+987.876035814" observedRunningTime="2025-12-03 17:16:37.901565328 +0000 UTC m=+992.289086045" watchObservedRunningTime="2025-12-03 17:16:37.902777616 +0000 UTC m=+992.290298343" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.927888 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-qgzs7" podStartSLOduration=34.178648214 podStartE2EDuration="54.927873012s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:44.294246564 +0000 UTC m=+938.681767291" lastFinishedPulling="2025-12-03 17:16:05.043471362 +0000 UTC m=+959.430992089" observedRunningTime="2025-12-03 17:16:37.921525464 +0000 UTC m=+992.309046201" watchObservedRunningTime="2025-12-03 17:16:37.927873012 +0000 UTC m=+992.315393739" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.958811 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" podStartSLOduration=26.199655715 podStartE2EDuration="54.958794364s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:16:06.858355782 +0000 UTC m=+961.245876509" lastFinishedPulling="2025-12-03 17:16:35.617494421 +0000 UTC m=+990.005015158" observedRunningTime="2025-12-03 17:16:37.954978475 +0000 UTC m=+992.342499202" watchObservedRunningTime="2025-12-03 17:16:37.958794364 +0000 UTC m=+992.346315081" Dec 03 17:16:37 crc kubenswrapper[4841]: I1203 17:16:37.991879 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" podStartSLOduration=5.8362417749999995 podStartE2EDuration="54.991864596s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.421125027 +0000 UTC m=+939.808645744" lastFinishedPulling="2025-12-03 17:16:34.576747838 
+0000 UTC m=+988.964268565" observedRunningTime="2025-12-03 17:16:37.977761957 +0000 UTC m=+992.365282694" watchObservedRunningTime="2025-12-03 17:16:37.991864596 +0000 UTC m=+992.379385323" Dec 03 17:16:38 crc kubenswrapper[4841]: I1203 17:16:38.743039 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" Dec 03 17:16:38 crc kubenswrapper[4841]: I1203 17:16:38.744728 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-65c59f5d56-72jtb" Dec 03 17:16:38 crc kubenswrapper[4841]: I1203 17:16:38.764643 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" podStartSLOduration=7.100953725 podStartE2EDuration="55.76462471s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:44.628460017 +0000 UTC m=+939.015980744" lastFinishedPulling="2025-12-03 17:16:33.292131002 +0000 UTC m=+987.679651729" observedRunningTime="2025-12-03 17:16:37.99759816 +0000 UTC m=+992.385118887" watchObservedRunningTime="2025-12-03 17:16:38.76462471 +0000 UTC m=+993.152145447" Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.316897 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.317162 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.317233 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.318012 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b3e3ec18aa928c5194a578236f76747e824d216c75a2a957951a1e3726f7b86a"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.318108 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://b3e3ec18aa928c5194a578236f76747e824d216c75a2a957951a1e3726f7b86a" gracePeriod=600 Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.466834 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.476181 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-s2b4r" Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.754963 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="b3e3ec18aa928c5194a578236f76747e824d216c75a2a957951a1e3726f7b86a" exitCode=0 Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.755056 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" 
event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"b3e3ec18aa928c5194a578236f76747e824d216c75a2a957951a1e3726f7b86a"} Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.755097 4841 scope.go:117] "RemoveContainer" containerID="1b26372173031353039ef7a8dc0bcb0ae6765d7cf65cf0a4fe3dfc913d879b03" Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.757891 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" event={"ID":"56b976ca-c419-42f4-b063-c0219f4e0a72","Type":"ContainerStarted","Data":"94fb634d20f5ec1a5d0d3b55ca1b6214cc4dfa6eac42d0cede66adfcb27974bc"} Dec 03 17:16:39 crc kubenswrapper[4841]: I1203 17:16:39.779796 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-fnjsz" podStartSLOduration=37.249745275 podStartE2EDuration="56.779773715s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:15:45.514064317 +0000 UTC m=+939.901585034" lastFinishedPulling="2025-12-03 17:16:05.044092747 +0000 UTC m=+959.431613474" observedRunningTime="2025-12-03 17:16:39.775421264 +0000 UTC m=+994.162942031" watchObservedRunningTime="2025-12-03 17:16:39.779773715 +0000 UTC m=+994.167294452" Dec 03 17:16:40 crc kubenswrapper[4841]: I1203 17:16:40.771624 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"329f3ec52dfd2e9fae18e4f92bbdfa693dee71eed4da1af39ebdcda2381dc16d"} Dec 03 17:16:42 crc kubenswrapper[4841]: I1203 17:16:42.894712 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:42 crc kubenswrapper[4841]: I1203 17:16:42.895166 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:42 crc kubenswrapper[4841]: I1203 17:16:42.959775 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:43 crc kubenswrapper[4841]: I1203 17:16:43.373641 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-hwp8p" Dec 03 17:16:43 crc kubenswrapper[4841]: I1203 17:16:43.449797 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-jmzr6" Dec 03 17:16:43 crc kubenswrapper[4841]: I1203 17:16:43.502244 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-z8zb5" Dec 03 17:16:43 crc kubenswrapper[4841]: I1203 17:16:43.566429 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-8jcwp" Dec 03 17:16:43 crc kubenswrapper[4841]: I1203 17:16:43.811032 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-fj8w7" Dec 03 17:16:43 crc kubenswrapper[4841]: I1203 17:16:43.862831 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-jfddn" Dec 03 17:16:43 crc kubenswrapper[4841]: I1203 17:16:43.912919 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:43 crc kubenswrapper[4841]: I1203 17:16:43.931290 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" Dec 03 17:16:44 crc kubenswrapper[4841]: I1203 
17:16:44.018032 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-sl5jm" Dec 03 17:16:44 crc kubenswrapper[4841]: I1203 17:16:44.066459 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-2d7g6" Dec 03 17:16:44 crc kubenswrapper[4841]: I1203 17:16:44.203391 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9pgf"] Dec 03 17:16:45 crc kubenswrapper[4841]: I1203 17:16:45.836443 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z9pgf" podUID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerName="registry-server" containerID="cri-o://12a3e01f1657a89b757b1cc247f8b7f59416d3e9690532b82f78448f0801f8e6" gracePeriod=2 Dec 03 17:16:47 crc kubenswrapper[4841]: I1203 17:16:47.855228 4841 generic.go:334] "Generic (PLEG): container finished" podID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerID="12a3e01f1657a89b757b1cc247f8b7f59416d3e9690532b82f78448f0801f8e6" exitCode=0 Dec 03 17:16:47 crc kubenswrapper[4841]: I1203 17:16:47.855498 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z9pgf" event={"ID":"119786cb-9b22-466d-a3a2-ac9fd8465731","Type":"ContainerDied","Data":"12a3e01f1657a89b757b1cc247f8b7f59416d3e9690532b82f78448f0801f8e6"} Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.306782 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.473072 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgkt2\" (UniqueName: \"kubernetes.io/projected/119786cb-9b22-466d-a3a2-ac9fd8465731-kube-api-access-lgkt2\") pod \"119786cb-9b22-466d-a3a2-ac9fd8465731\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.473152 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-utilities\") pod \"119786cb-9b22-466d-a3a2-ac9fd8465731\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.473187 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-catalog-content\") pod \"119786cb-9b22-466d-a3a2-ac9fd8465731\" (UID: \"119786cb-9b22-466d-a3a2-ac9fd8465731\") " Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.474534 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-utilities" (OuterVolumeSpecName: "utilities") pod "119786cb-9b22-466d-a3a2-ac9fd8465731" (UID: "119786cb-9b22-466d-a3a2-ac9fd8465731"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.480258 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/119786cb-9b22-466d-a3a2-ac9fd8465731-kube-api-access-lgkt2" (OuterVolumeSpecName: "kube-api-access-lgkt2") pod "119786cb-9b22-466d-a3a2-ac9fd8465731" (UID: "119786cb-9b22-466d-a3a2-ac9fd8465731"). InnerVolumeSpecName "kube-api-access-lgkt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.499958 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "119786cb-9b22-466d-a3a2-ac9fd8465731" (UID: "119786cb-9b22-466d-a3a2-ac9fd8465731"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.574667 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgkt2\" (UniqueName: \"kubernetes.io/projected/119786cb-9b22-466d-a3a2-ac9fd8465731-kube-api-access-lgkt2\") on node \"crc\" DevicePath \"\"" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.574708 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.574723 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/119786cb-9b22-466d-a3a2-ac9fd8465731-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.866368 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" event={"ID":"7e01626f-e7f3-4c48-bc9b-5d9261b3d89a","Type":"ContainerStarted","Data":"4ebe0f976ea1b55c555e2cae362ef309342cebffabec120b96e339c340a3c032"} Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.866577 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.869753 4841 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-z9pgf" event={"ID":"119786cb-9b22-466d-a3a2-ac9fd8465731","Type":"ContainerDied","Data":"7ac0e3e6714135fbc22b7127ccc17a77f65a97f4dc9a12043725e51d5d2c799e"} Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.869785 4841 scope.go:117] "RemoveContainer" containerID="12a3e01f1657a89b757b1cc247f8b7f59416d3e9690532b82f78448f0801f8e6" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.869870 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z9pgf" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.897976 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" podStartSLOduration=29.071652507 podStartE2EDuration="1m5.897954085s" podCreationTimestamp="2025-12-03 17:15:43 +0000 UTC" firstStartedPulling="2025-12-03 17:16:11.058463299 +0000 UTC m=+965.445984026" lastFinishedPulling="2025-12-03 17:16:47.884764877 +0000 UTC m=+1002.272285604" observedRunningTime="2025-12-03 17:16:48.892108309 +0000 UTC m=+1003.279629046" watchObservedRunningTime="2025-12-03 17:16:48.897954085 +0000 UTC m=+1003.285474812" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.902794 4841 scope.go:117] "RemoveContainer" containerID="7b18cf1f75755542f3eb10b87e847fb1c52ab6e838b7a6df37a883ef92e2048a" Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.921021 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9pgf"] Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.928710 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z9pgf"] Dec 03 17:16:48 crc kubenswrapper[4841]: I1203 17:16:48.933674 4841 scope.go:117] "RemoveContainer" containerID="e8967e0176fa2bd56ee254141ebfccb2a6d04007592126cf3184944df247d36e" Dec 03 17:16:50 crc 
kubenswrapper[4841]: I1203 17:16:50.253758 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="119786cb-9b22-466d-a3a2-ac9fd8465731" path="/var/lib/kubelet/pods/119786cb-9b22-466d-a3a2-ac9fd8465731/volumes" Dec 03 17:16:59 crc kubenswrapper[4841]: I1203 17:16:59.657380 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.520264 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cwwkc"] Dec 03 17:17:14 crc kubenswrapper[4841]: E1203 17:17:14.521197 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerName="extract-utilities" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.521216 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerName="extract-utilities" Dec 03 17:17:14 crc kubenswrapper[4841]: E1203 17:17:14.521252 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerName="extract-utilities" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.521260 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerName="extract-utilities" Dec 03 17:17:14 crc kubenswrapper[4841]: E1203 17:17:14.521277 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerName="extract-content" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.521286 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerName="extract-content" Dec 03 17:17:14 crc kubenswrapper[4841]: E1203 17:17:14.521298 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="119786cb-9b22-466d-a3a2-ac9fd8465731" 
containerName="registry-server" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.521306 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerName="registry-server" Dec 03 17:17:14 crc kubenswrapper[4841]: E1203 17:17:14.521333 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerName="registry-server" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.521340 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerName="registry-server" Dec 03 17:17:14 crc kubenswrapper[4841]: E1203 17:17:14.521368 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerName="extract-content" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.521397 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerName="extract-content" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.521550 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="119786cb-9b22-466d-a3a2-ac9fd8465731" containerName="registry-server" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.521575 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9cc4d57-7fab-4a4e-9a5d-b334918f4c49" containerName="registry-server" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.522552 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.524816 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-2ssgn" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.525045 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.525215 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.526817 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.539010 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cwwkc"] Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.563200 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf3440bc-e009-4bd4-8390-bf798b62db20-config\") pod \"dnsmasq-dns-675f4bcbfc-cwwkc\" (UID: \"bf3440bc-e009-4bd4-8390-bf798b62db20\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.563328 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wgnr\" (UniqueName: \"kubernetes.io/projected/bf3440bc-e009-4bd4-8390-bf798b62db20-kube-api-access-5wgnr\") pod \"dnsmasq-dns-675f4bcbfc-cwwkc\" (UID: \"bf3440bc-e009-4bd4-8390-bf798b62db20\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.625680 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pr2gh"] Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.627398 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.630469 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.650229 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pr2gh"] Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.664066 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wgnr\" (UniqueName: \"kubernetes.io/projected/bf3440bc-e009-4bd4-8390-bf798b62db20-kube-api-access-5wgnr\") pod \"dnsmasq-dns-675f4bcbfc-cwwkc\" (UID: \"bf3440bc-e009-4bd4-8390-bf798b62db20\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.664115 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t69f\" (UniqueName: \"kubernetes.io/projected/722e6d91-a445-4121-afd3-8c06ab5d7ce4-kube-api-access-5t69f\") pod \"dnsmasq-dns-78dd6ddcc-pr2gh\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.664146 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-config\") pod \"dnsmasq-dns-78dd6ddcc-pr2gh\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.664170 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf3440bc-e009-4bd4-8390-bf798b62db20-config\") pod \"dnsmasq-dns-675f4bcbfc-cwwkc\" (UID: \"bf3440bc-e009-4bd4-8390-bf798b62db20\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:14 crc 
kubenswrapper[4841]: I1203 17:17:14.664284 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pr2gh\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.665048 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf3440bc-e009-4bd4-8390-bf798b62db20-config\") pod \"dnsmasq-dns-675f4bcbfc-cwwkc\" (UID: \"bf3440bc-e009-4bd4-8390-bf798b62db20\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.712968 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wgnr\" (UniqueName: \"kubernetes.io/projected/bf3440bc-e009-4bd4-8390-bf798b62db20-kube-api-access-5wgnr\") pod \"dnsmasq-dns-675f4bcbfc-cwwkc\" (UID: \"bf3440bc-e009-4bd4-8390-bf798b62db20\") " pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.765531 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-config\") pod \"dnsmasq-dns-78dd6ddcc-pr2gh\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.765636 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pr2gh\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.765710 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5t69f\" (UniqueName: \"kubernetes.io/projected/722e6d91-a445-4121-afd3-8c06ab5d7ce4-kube-api-access-5t69f\") pod \"dnsmasq-dns-78dd6ddcc-pr2gh\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.766484 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-pr2gh\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.766533 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-config\") pod \"dnsmasq-dns-78dd6ddcc-pr2gh\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.795785 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t69f\" (UniqueName: \"kubernetes.io/projected/722e6d91-a445-4121-afd3-8c06ab5d7ce4-kube-api-access-5t69f\") pod \"dnsmasq-dns-78dd6ddcc-pr2gh\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.851209 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:14 crc kubenswrapper[4841]: I1203 17:17:14.943870 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:15 crc kubenswrapper[4841]: I1203 17:17:15.323893 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pr2gh"] Dec 03 17:17:15 crc kubenswrapper[4841]: I1203 17:17:15.358705 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cwwkc"] Dec 03 17:17:15 crc kubenswrapper[4841]: W1203 17:17:15.364428 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf3440bc_e009_4bd4_8390_bf798b62db20.slice/crio-af6a41f5af5dbe2e3073824ebf04c9edc690d73cb104e658f480bb7678b44f2d WatchSource:0}: Error finding container af6a41f5af5dbe2e3073824ebf04c9edc690d73cb104e658f480bb7678b44f2d: Status 404 returned error can't find the container with id af6a41f5af5dbe2e3073824ebf04c9edc690d73cb104e658f480bb7678b44f2d Dec 03 17:17:16 crc kubenswrapper[4841]: I1203 17:17:16.117178 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" event={"ID":"722e6d91-a445-4121-afd3-8c06ab5d7ce4","Type":"ContainerStarted","Data":"b5a64e4a3278ad19c5f39193ee4fbae48273b7d614c0de4742317b48a9034908"} Dec 03 17:17:16 crc kubenswrapper[4841]: I1203 17:17:16.118970 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" event={"ID":"bf3440bc-e009-4bd4-8390-bf798b62db20","Type":"ContainerStarted","Data":"af6a41f5af5dbe2e3073824ebf04c9edc690d73cb104e658f480bb7678b44f2d"} Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.698976 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cwwkc"] Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.739886 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h749r"] Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.742388 4841 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.768539 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h749r"] Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.813736 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp77w\" (UniqueName: \"kubernetes.io/projected/4c42f036-0fe4-4ffc-9084-f6e64da6314c-kube-api-access-wp77w\") pod \"dnsmasq-dns-666b6646f7-h749r\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.813807 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-config\") pod \"dnsmasq-dns-666b6646f7-h749r\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.813932 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-h749r\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.915330 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-h749r\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.915428 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wp77w\" 
(UniqueName: \"kubernetes.io/projected/4c42f036-0fe4-4ffc-9084-f6e64da6314c-kube-api-access-wp77w\") pod \"dnsmasq-dns-666b6646f7-h749r\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.915466 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-config\") pod \"dnsmasq-dns-666b6646f7-h749r\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.916408 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-config\") pod \"dnsmasq-dns-666b6646f7-h749r\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.917808 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-dns-svc\") pod \"dnsmasq-dns-666b6646f7-h749r\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:17 crc kubenswrapper[4841]: I1203 17:17:17.970530 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp77w\" (UniqueName: \"kubernetes.io/projected/4c42f036-0fe4-4ffc-9084-f6e64da6314c-kube-api-access-wp77w\") pod \"dnsmasq-dns-666b6646f7-h749r\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.081530 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.092799 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pr2gh"] Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.126123 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fsj22"] Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.127218 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.143045 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fsj22"] Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.334121 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-fsj22\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.334235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwxxv\" (UniqueName: \"kubernetes.io/projected/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-kube-api-access-gwxxv\") pod \"dnsmasq-dns-57d769cc4f-fsj22\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.334301 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-config\") pod \"dnsmasq-dns-57d769cc4f-fsj22\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 
17:17:18.437997 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-fsj22\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.438092 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwxxv\" (UniqueName: \"kubernetes.io/projected/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-kube-api-access-gwxxv\") pod \"dnsmasq-dns-57d769cc4f-fsj22\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.438141 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-config\") pod \"dnsmasq-dns-57d769cc4f-fsj22\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.439562 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-fsj22\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.439962 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-config\") pod \"dnsmasq-dns-57d769cc4f-fsj22\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.468666 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwxxv\" 
(UniqueName: \"kubernetes.io/projected/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-kube-api-access-gwxxv\") pod \"dnsmasq-dns-57d769cc4f-fsj22\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.478297 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.878875 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.880454 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.882887 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.882964 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5j9kb" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.883175 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.883196 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.883431 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.883497 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.884571 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 17:17:18 crc kubenswrapper[4841]: I1203 17:17:18.903410 4841 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.046964 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.047220 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbf9t\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-kube-api-access-wbf9t\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.047323 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9e8bee6-ec4a-4743-9ca4-62c37c278958-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.047391 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.047464 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-config-data\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " 
pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.047523 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.047593 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.047624 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.047675 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.047748 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9e8bee6-ec4a-4743-9ca4-62c37c278958-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc 
kubenswrapper[4841]: I1203 17:17:19.047766 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.149694 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-config-data\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.149767 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.149815 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.149838 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.149861 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.149930 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.149953 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9e8bee6-ec4a-4743-9ca4-62c37c278958-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.149988 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.150026 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbf9t\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-kube-api-access-wbf9t\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.150058 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9e8bee6-ec4a-4743-9ca4-62c37c278958-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.150103 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.150306 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.150372 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.151075 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.151938 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.152455 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.154508 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.154709 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.155675 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9e8bee6-ec4a-4743-9ca4-62c37c278958-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.156447 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-config-data\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.157516 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9e8bee6-ec4a-4743-9ca4-62c37c278958-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.170067 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbf9t\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-kube-api-access-wbf9t\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.179808 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.203026 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.239146 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.240787 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.246332 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.246550 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.246704 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.249026 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.249099 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.249536 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vx5zq" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.256299 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.256400 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.352982 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353027 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"server-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353048 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353081 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/388e49e3-0d92-49a4-a165-810b7ac67577-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353267 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95nhj\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-kube-api-access-95nhj\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/388e49e3-0d92-49a4-a165-810b7ac67577-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353651 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353724 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353759 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353795 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.353817 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456065 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/388e49e3-0d92-49a4-a165-810b7ac67577-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456122 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456143 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456166 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456189 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456205 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456236 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456262 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456313 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/388e49e3-0d92-49a4-a165-810b7ac67577-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456336 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95nhj\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-kube-api-access-95nhj\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 
17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.456392 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.457102 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.457512 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.457881 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.458398 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.459051 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.461111 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/388e49e3-0d92-49a4-a165-810b7ac67577-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.462538 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.466446 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/388e49e3-0d92-49a4-a165-810b7ac67577-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.482389 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.484627 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.484766 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95nhj\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-kube-api-access-95nhj\") pod \"rabbitmq-cell1-server-0\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:19 crc kubenswrapper[4841]: I1203 17:17:19.576481 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.632154 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.642936 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.644653 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.645429 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.645926 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.646340 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-km978" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.647777 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.661866 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 17:17:20 crc 
kubenswrapper[4841]: I1203 17:17:20.783567 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wmb8\" (UniqueName: \"kubernetes.io/projected/7c3685ed-a2fd-4f00-9452-70f9713117b3-kube-api-access-5wmb8\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.783629 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7c3685ed-a2fd-4f00-9452-70f9713117b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.783666 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.783708 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c3685ed-a2fd-4f00-9452-70f9713117b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.783741 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7c3685ed-a2fd-4f00-9452-70f9713117b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.783779 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c3685ed-a2fd-4f00-9452-70f9713117b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.783808 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3685ed-a2fd-4f00-9452-70f9713117b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.783836 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3685ed-a2fd-4f00-9452-70f9713117b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.885006 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3685ed-a2fd-4f00-9452-70f9713117b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.885090 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3685ed-a2fd-4f00-9452-70f9713117b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.885160 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5wmb8\" (UniqueName: \"kubernetes.io/projected/7c3685ed-a2fd-4f00-9452-70f9713117b3-kube-api-access-5wmb8\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.885198 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7c3685ed-a2fd-4f00-9452-70f9713117b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.885231 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.885269 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c3685ed-a2fd-4f00-9452-70f9713117b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.885301 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7c3685ed-a2fd-4f00-9452-70f9713117b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.885338 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c3685ed-a2fd-4f00-9452-70f9713117b3-operator-scripts\") pod \"openstack-galera-0\" 
(UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.885947 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7c3685ed-a2fd-4f00-9452-70f9713117b3-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.887305 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c3685ed-a2fd-4f00-9452-70f9713117b3-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.887511 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.888555 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c3685ed-a2fd-4f00-9452-70f9713117b3-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.889355 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7c3685ed-a2fd-4f00-9452-70f9713117b3-config-data-default\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 
17:17:20.892424 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c3685ed-a2fd-4f00-9452-70f9713117b3-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.894703 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7c3685ed-a2fd-4f00-9452-70f9713117b3-kolla-config\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.908845 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wmb8\" (UniqueName: \"kubernetes.io/projected/7c3685ed-a2fd-4f00-9452-70f9713117b3-kube-api-access-5wmb8\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.926856 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"7c3685ed-a2fd-4f00-9452-70f9713117b3\") " pod="openstack/openstack-galera-0" Dec 03 17:17:20 crc kubenswrapper[4841]: I1203 17:17:20.972797 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.151592 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.153461 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.155669 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.155942 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.163978 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-6lxwc" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.164464 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.216635 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.285056 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.285976 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.288076 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.290868 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-8j6r7" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.291365 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.312262 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/382c890c-5616-49ab-afd8-59fa071147b4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.312372 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55bst\" (UniqueName: \"kubernetes.io/projected/382c890c-5616-49ab-afd8-59fa071147b4-kube-api-access-55bst\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.312412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.312434 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/382c890c-5616-49ab-afd8-59fa071147b4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.312456 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c890c-5616-49ab-afd8-59fa071147b4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.312575 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c890c-5616-49ab-afd8-59fa071147b4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.312712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/382c890c-5616-49ab-afd8-59fa071147b4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.312745 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382c890c-5616-49ab-afd8-59fa071147b4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.339361 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 17:17:22 crc 
kubenswrapper[4841]: I1203 17:17:22.414510 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55bst\" (UniqueName: \"kubernetes.io/projected/382c890c-5616-49ab-afd8-59fa071147b4-kube-api-access-55bst\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.414864 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.414893 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/382c890c-5616-49ab-afd8-59fa071147b4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415152 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c890c-5616-49ab-afd8-59fa071147b4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415118 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415296 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c890c-5616-49ab-afd8-59fa071147b4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415334 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81cb6c96-a5d5-4120-8cb3-101344626b07-kolla-config\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415359 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81cb6c96-a5d5-4120-8cb3-101344626b07-config-data\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415384 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/81cb6c96-a5d5-4120-8cb3-101344626b07-memcached-tls-certs\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415418 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/382c890c-5616-49ab-afd8-59fa071147b4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415441 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/382c890c-5616-49ab-afd8-59fa071147b4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cb6c96-a5d5-4120-8cb3-101344626b07-combined-ca-bundle\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415573 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/382c890c-5616-49ab-afd8-59fa071147b4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415601 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7lfp\" (UniqueName: \"kubernetes.io/projected/81cb6c96-a5d5-4120-8cb3-101344626b07-kube-api-access-t7lfp\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.415657 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/382c890c-5616-49ab-afd8-59fa071147b4-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.416142 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/382c890c-5616-49ab-afd8-59fa071147b4-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.416479 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/382c890c-5616-49ab-afd8-59fa071147b4-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.416621 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/382c890c-5616-49ab-afd8-59fa071147b4-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.421186 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/382c890c-5616-49ab-afd8-59fa071147b4-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.421255 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/382c890c-5616-49ab-afd8-59fa071147b4-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.436842 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55bst\" (UniqueName: \"kubernetes.io/projected/382c890c-5616-49ab-afd8-59fa071147b4-kube-api-access-55bst\") pod 
\"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.471999 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"382c890c-5616-49ab-afd8-59fa071147b4\") " pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.512321 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.517086 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7lfp\" (UniqueName: \"kubernetes.io/projected/81cb6c96-a5d5-4120-8cb3-101344626b07-kube-api-access-t7lfp\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.517177 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81cb6c96-a5d5-4120-8cb3-101344626b07-kolla-config\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.517201 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81cb6c96-a5d5-4120-8cb3-101344626b07-config-data\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.517215 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/81cb6c96-a5d5-4120-8cb3-101344626b07-memcached-tls-certs\") pod 
\"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.517257 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cb6c96-a5d5-4120-8cb3-101344626b07-combined-ca-bundle\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.518647 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81cb6c96-a5d5-4120-8cb3-101344626b07-config-data\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.520741 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81cb6c96-a5d5-4120-8cb3-101344626b07-kolla-config\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.521349 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81cb6c96-a5d5-4120-8cb3-101344626b07-combined-ca-bundle\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.521367 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/81cb6c96-a5d5-4120-8cb3-101344626b07-memcached-tls-certs\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.537186 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t7lfp\" (UniqueName: \"kubernetes.io/projected/81cb6c96-a5d5-4120-8cb3-101344626b07-kube-api-access-t7lfp\") pod \"memcached-0\" (UID: \"81cb6c96-a5d5-4120-8cb3-101344626b07\") " pod="openstack/memcached-0" Dec 03 17:17:22 crc kubenswrapper[4841]: I1203 17:17:22.598797 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 17:17:23 crc kubenswrapper[4841]: I1203 17:17:23.960038 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 17:17:23 crc kubenswrapper[4841]: I1203 17:17:23.961294 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 17:17:23 crc kubenswrapper[4841]: I1203 17:17:23.964882 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-wbr2c" Dec 03 17:17:23 crc kubenswrapper[4841]: I1203 17:17:23.969374 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 17:17:24 crc kubenswrapper[4841]: I1203 17:17:24.039837 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcc7c\" (UniqueName: \"kubernetes.io/projected/800c114f-56e0-4bb3-8b43-f6b2f623584a-kube-api-access-qcc7c\") pod \"kube-state-metrics-0\" (UID: \"800c114f-56e0-4bb3-8b43-f6b2f623584a\") " pod="openstack/kube-state-metrics-0" Dec 03 17:17:24 crc kubenswrapper[4841]: I1203 17:17:24.141053 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcc7c\" (UniqueName: \"kubernetes.io/projected/800c114f-56e0-4bb3-8b43-f6b2f623584a-kube-api-access-qcc7c\") pod \"kube-state-metrics-0\" (UID: \"800c114f-56e0-4bb3-8b43-f6b2f623584a\") " pod="openstack/kube-state-metrics-0" Dec 03 17:17:24 crc kubenswrapper[4841]: I1203 17:17:24.158599 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qcc7c\" (UniqueName: \"kubernetes.io/projected/800c114f-56e0-4bb3-8b43-f6b2f623584a-kube-api-access-qcc7c\") pod \"kube-state-metrics-0\" (UID: \"800c114f-56e0-4bb3-8b43-f6b2f623584a\") " pod="openstack/kube-state-metrics-0" Dec 03 17:17:24 crc kubenswrapper[4841]: I1203 17:17:24.289140 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.760036 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-cqt22"] Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.761873 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cqt22" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.764541 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.765199 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.766630 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-2hlpg" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.772105 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-cj8qw"] Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.774474 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.787835 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cqt22"] Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.797408 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cj8qw"] Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.912473 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-var-lib\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.912523 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-var-log\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.912540 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-var-run\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.912575 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-combined-ca-bundle\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.912745 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-var-run\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.912806 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-etc-ovs\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.912861 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5d7g\" (UniqueName: \"kubernetes.io/projected/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-kube-api-access-b5d7g\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.912964 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-scripts\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.912997 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-var-log-ovn\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.913032 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjj5t\" (UniqueName: \"kubernetes.io/projected/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-kube-api-access-xjj5t\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.913060 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-scripts\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.913102 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-ovn-controller-tls-certs\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:27 crc kubenswrapper[4841]: I1203 17:17:27.913149 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-var-run-ovn\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.014992 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-var-log\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015040 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-var-run\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015073 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-combined-ca-bundle\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-var-run\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015128 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-etc-ovs\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015152 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5d7g\" (UniqueName: \"kubernetes.io/projected/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-kube-api-access-b5d7g\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015180 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-scripts\") pod \"ovn-controller-ovs-cj8qw\" 
(UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-var-log-ovn\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015215 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjj5t\" (UniqueName: \"kubernetes.io/projected/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-kube-api-access-xjj5t\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015231 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-scripts\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015253 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-ovn-controller-tls-certs\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-var-run-ovn\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc 
kubenswrapper[4841]: I1203 17:17:28.015316 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-var-lib\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015816 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-var-lib\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.015959 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-var-log\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.016082 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-var-run\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.017262 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-etc-ovs\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.017328 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-var-run\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.017429 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-var-run-ovn\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.017570 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-var-log-ovn\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.018935 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-scripts\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.021286 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-scripts\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.024651 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-combined-ca-bundle\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 
17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.028967 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-ovn-controller-tls-certs\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.038877 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5d7g\" (UniqueName: \"kubernetes.io/projected/41fcd3cb-c81b-4a15-bb57-aa38bfa47e41-kube-api-access-b5d7g\") pod \"ovn-controller-cqt22\" (UID: \"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41\") " pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.040383 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjj5t\" (UniqueName: \"kubernetes.io/projected/b3fab8b5-6122-451f-9b66-a1dbb0813c1b-kube-api-access-xjj5t\") pod \"ovn-controller-ovs-cj8qw\" (UID: \"b3fab8b5-6122-451f-9b66-a1dbb0813c1b\") " pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.090805 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cqt22" Dec 03 17:17:28 crc kubenswrapper[4841]: I1203 17:17:28.105191 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.323763 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.325431 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.328224 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.328245 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.328249 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-8bv4m" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.329397 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.330337 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.350831 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.465832 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85311e5-2270-4d86-a617-1b7da0a346c8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.465873 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a85311e5-2270-4d86-a617-1b7da0a346c8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.466219 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9nn4n\" (UniqueName: \"kubernetes.io/projected/a85311e5-2270-4d86-a617-1b7da0a346c8-kube-api-access-9nn4n\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.466322 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85311e5-2270-4d86-a617-1b7da0a346c8-config\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.466456 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a85311e5-2270-4d86-a617-1b7da0a346c8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.466500 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.466555 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85311e5-2270-4d86-a617-1b7da0a346c8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.466590 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a85311e5-2270-4d86-a617-1b7da0a346c8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.527822 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.529501 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.532581 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.532795 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lbw4z" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.536573 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.537756 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.543341 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.567808 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nn4n\" (UniqueName: \"kubernetes.io/projected/a85311e5-2270-4d86-a617-1b7da0a346c8-kube-api-access-9nn4n\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.567872 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85311e5-2270-4d86-a617-1b7da0a346c8-config\") pod 
\"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.567939 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a85311e5-2270-4d86-a617-1b7da0a346c8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.567973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.568013 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85311e5-2270-4d86-a617-1b7da0a346c8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.568036 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85311e5-2270-4d86-a617-1b7da0a346c8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.568098 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85311e5-2270-4d86-a617-1b7da0a346c8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.568129 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a85311e5-2270-4d86-a617-1b7da0a346c8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.568419 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.568891 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a85311e5-2270-4d86-a617-1b7da0a346c8-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.569628 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a85311e5-2270-4d86-a617-1b7da0a346c8-config\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.569775 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a85311e5-2270-4d86-a617-1b7da0a346c8-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.574209 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85311e5-2270-4d86-a617-1b7da0a346c8-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" 
(UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.574643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a85311e5-2270-4d86-a617-1b7da0a346c8-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.589744 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nn4n\" (UniqueName: \"kubernetes.io/projected/a85311e5-2270-4d86-a617-1b7da0a346c8-kube-api-access-9nn4n\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.594990 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a85311e5-2270-4d86-a617-1b7da0a346c8-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.597933 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"ovsdbserver-nb-0\" (UID: \"a85311e5-2270-4d86-a617-1b7da0a346c8\") " pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.653009 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.669992 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b50c59-2571-4a25-bff5-bc84b18d7315-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.670053 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f9b50c59-2571-4a25-bff5-bc84b18d7315-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.670081 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.670103 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcs42\" (UniqueName: \"kubernetes.io/projected/f9b50c59-2571-4a25-bff5-bc84b18d7315-kube-api-access-qcs42\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.670219 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b50c59-2571-4a25-bff5-bc84b18d7315-config\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc 
kubenswrapper[4841]: I1203 17:17:31.670318 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b50c59-2571-4a25-bff5-bc84b18d7315-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.670388 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b50c59-2571-4a25-bff5-bc84b18d7315-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.670431 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b50c59-2571-4a25-bff5-bc84b18d7315-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.772280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b50c59-2571-4a25-bff5-bc84b18d7315-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.772351 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b50c59-2571-4a25-bff5-bc84b18d7315-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.772379 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b50c59-2571-4a25-bff5-bc84b18d7315-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.772410 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f9b50c59-2571-4a25-bff5-bc84b18d7315-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.772434 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.772455 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcs42\" (UniqueName: \"kubernetes.io/projected/f9b50c59-2571-4a25-bff5-bc84b18d7315-kube-api-access-qcs42\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.772484 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b50c59-2571-4a25-bff5-bc84b18d7315-config\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.772508 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b50c59-2571-4a25-bff5-bc84b18d7315-scripts\") pod \"ovsdbserver-sb-0\" (UID: 
\"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.773545 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9b50c59-2571-4a25-bff5-bc84b18d7315-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.773642 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.774048 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f9b50c59-2571-4a25-bff5-bc84b18d7315-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.774459 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9b50c59-2571-4a25-bff5-bc84b18d7315-config\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.777017 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b50c59-2571-4a25-bff5-bc84b18d7315-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.777168 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9b50c59-2571-4a25-bff5-bc84b18d7315-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.777565 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9b50c59-2571-4a25-bff5-bc84b18d7315-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.793710 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.796084 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcs42\" (UniqueName: \"kubernetes.io/projected/f9b50c59-2571-4a25-bff5-bc84b18d7315-kube-api-access-qcs42\") pod \"ovsdbserver-sb-0\" (UID: \"f9b50c59-2571-4a25-bff5-bc84b18d7315\") " pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:31 crc kubenswrapper[4841]: I1203 17:17:31.848344 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:33 crc kubenswrapper[4841]: I1203 17:17:33.824752 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 17:17:34 crc kubenswrapper[4841]: E1203 17:17:34.232579 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 17:17:34 crc kubenswrapper[4841]: E1203 17:17:34.233109 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5t69f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-pr2gh_openstack(722e6d91-a445-4121-afd3-8c06ab5d7ce4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:17:34 crc kubenswrapper[4841]: E1203 17:17:34.234368 4841 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" podUID="722e6d91-a445-4121-afd3-8c06ab5d7ce4" Dec 03 17:17:34 crc kubenswrapper[4841]: E1203 17:17:34.235015 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 17:17:34 crc kubenswrapper[4841]: E1203 17:17:34.235155 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5wgnr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-cwwkc_openstack(bf3440bc-e009-4bd4-8390-bf798b62db20): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:17:34 crc kubenswrapper[4841]: E1203 17:17:34.236342 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" podUID="bf3440bc-e009-4bd4-8390-bf798b62db20" Dec 03 17:17:34 crc kubenswrapper[4841]: W1203 17:17:34.254856 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c3685ed_a2fd_4f00_9452_70f9713117b3.slice/crio-0ee119e596f88791b91fd559c242538fccf2cf91bfc56c66384d00423b6f54e3 WatchSource:0}: Error finding container 0ee119e596f88791b91fd559c242538fccf2cf91bfc56c66384d00423b6f54e3: Status 404 returned error can't find the container with id 0ee119e596f88791b91fd559c242538fccf2cf91bfc56c66384d00423b6f54e3 Dec 03 17:17:34 crc kubenswrapper[4841]: I1203 17:17:34.344493 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7c3685ed-a2fd-4f00-9452-70f9713117b3","Type":"ContainerStarted","Data":"0ee119e596f88791b91fd559c242538fccf2cf91bfc56c66384d00423b6f54e3"} Dec 03 17:17:34 crc kubenswrapper[4841]: I1203 17:17:34.975014 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:34 crc kubenswrapper[4841]: I1203 17:17:34.982657 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.001935 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h749r"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.039102 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-config\") pod \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.039166 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wgnr\" (UniqueName: \"kubernetes.io/projected/bf3440bc-e009-4bd4-8390-bf798b62db20-kube-api-access-5wgnr\") pod \"bf3440bc-e009-4bd4-8390-bf798b62db20\" (UID: \"bf3440bc-e009-4bd4-8390-bf798b62db20\") " Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.039283 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5t69f\" (UniqueName: \"kubernetes.io/projected/722e6d91-a445-4121-afd3-8c06ab5d7ce4-kube-api-access-5t69f\") pod \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.039315 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf3440bc-e009-4bd4-8390-bf798b62db20-config\") pod \"bf3440bc-e009-4bd4-8390-bf798b62db20\" (UID: \"bf3440bc-e009-4bd4-8390-bf798b62db20\") " Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.039360 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-dns-svc\") pod \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\" (UID: \"722e6d91-a445-4121-afd3-8c06ab5d7ce4\") " Dec 03 
17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.040286 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "722e6d91-a445-4121-afd3-8c06ab5d7ce4" (UID: "722e6d91-a445-4121-afd3-8c06ab5d7ce4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.040315 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-config" (OuterVolumeSpecName: "config") pod "722e6d91-a445-4121-afd3-8c06ab5d7ce4" (UID: "722e6d91-a445-4121-afd3-8c06ab5d7ce4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.040664 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf3440bc-e009-4bd4-8390-bf798b62db20-config" (OuterVolumeSpecName: "config") pod "bf3440bc-e009-4bd4-8390-bf798b62db20" (UID: "bf3440bc-e009-4bd4-8390-bf798b62db20"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.046828 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/722e6d91-a445-4121-afd3-8c06ab5d7ce4-kube-api-access-5t69f" (OuterVolumeSpecName: "kube-api-access-5t69f") pod "722e6d91-a445-4121-afd3-8c06ab5d7ce4" (UID: "722e6d91-a445-4121-afd3-8c06ab5d7ce4"). InnerVolumeSpecName "kube-api-access-5t69f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.048028 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf3440bc-e009-4bd4-8390-bf798b62db20-kube-api-access-5wgnr" (OuterVolumeSpecName: "kube-api-access-5wgnr") pod "bf3440bc-e009-4bd4-8390-bf798b62db20" (UID: "bf3440bc-e009-4bd4-8390-bf798b62db20"). InnerVolumeSpecName "kube-api-access-5wgnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:17:35 crc kubenswrapper[4841]: W1203 17:17:35.073431 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod800c114f_56e0_4bb3_8b43_f6b2f623584a.slice/crio-078193a9c68c2825a492f0f3731d568c122c6169ee4e9c8a6581834848783336 WatchSource:0}: Error finding container 078193a9c68c2825a492f0f3731d568c122c6169ee4e9c8a6581834848783336: Status 404 returned error can't find the container with id 078193a9c68c2825a492f0f3731d568c122c6169ee4e9c8a6581834848783336 Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.079959 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.088766 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 17:17:35 crc kubenswrapper[4841]: W1203 17:17:35.092072 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9e8bee6_ec4a_4743_9ca4_62c37c278958.slice/crio-e10de9fe60289ecf1869b4efca723fe3888f9f15595e59300e9c802096f0d8bc WatchSource:0}: Error finding container e10de9fe60289ecf1869b4efca723fe3888f9f15595e59300e9c802096f0d8bc: Status 404 returned error can't find the container with id e10de9fe60289ecf1869b4efca723fe3888f9f15595e59300e9c802096f0d8bc Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.097024 4841 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.100698 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.142564 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf3440bc-e009-4bd4-8390-bf798b62db20-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.142599 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.142608 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/722e6d91-a445-4121-afd3-8c06ab5d7ce4-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.142618 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wgnr\" (UniqueName: \"kubernetes.io/projected/bf3440bc-e009-4bd4-8390-bf798b62db20-kube-api-access-5wgnr\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.142629 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5t69f\" (UniqueName: \"kubernetes.io/projected/722e6d91-a445-4121-afd3-8c06ab5d7ce4-kube-api-access-5t69f\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:35 crc kubenswrapper[4841]: W1203 17:17:35.260462 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod41fcd3cb_c81b_4a15_bb57_aa38bfa47e41.slice/crio-8460cfac5a33ed8259f9c214c7782b386d6fe399de6f30527e723d1ba4d689d6 WatchSource:0}: Error finding container 8460cfac5a33ed8259f9c214c7782b386d6fe399de6f30527e723d1ba4d689d6: Status 404 returned error can't 
find the container with id 8460cfac5a33ed8259f9c214c7782b386d6fe399de6f30527e723d1ba4d689d6 Dec 03 17:17:35 crc kubenswrapper[4841]: W1203 17:17:35.269181 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81cb6c96_a5d5_4120_8cb3_101344626b07.slice/crio-c5d04cf8737ff7bcae4d7e2f5aff74fd49b0f3f8569e65cce604ac434923853e WatchSource:0}: Error finding container c5d04cf8737ff7bcae4d7e2f5aff74fd49b0f3f8569e65cce604ac434923853e: Status 404 returned error can't find the container with id c5d04cf8737ff7bcae4d7e2f5aff74fd49b0f3f8569e65cce604ac434923853e Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.273985 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cqt22"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.287559 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.296064 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fsj22"] Dec 03 17:17:35 crc kubenswrapper[4841]: W1203 17:17:35.296563 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf06bb6b5_7c2d_4fda_ae35_bf0c099491f2.slice/crio-8fff77c9c2f297d0a3d4ae4a4b9468cf8b612b1d2a8f67f96ff68e4f4b49c29d WatchSource:0}: Error finding container 8fff77c9c2f297d0a3d4ae4a4b9468cf8b612b1d2a8f67f96ff68e4f4b49c29d: Status 404 returned error can't find the container with id 8fff77c9c2f297d0a3d4ae4a4b9468cf8b612b1d2a8f67f96ff68e4f4b49c29d Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.356929 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" event={"ID":"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2","Type":"ContainerStarted","Data":"8fff77c9c2f297d0a3d4ae4a4b9468cf8b612b1d2a8f67f96ff68e4f4b49c29d"} Dec 03 17:17:35 crc kubenswrapper[4841]: 
I1203 17:17:35.358504 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9e8bee6-ec4a-4743-9ca4-62c37c278958","Type":"ContainerStarted","Data":"e10de9fe60289ecf1869b4efca723fe3888f9f15595e59300e9c802096f0d8bc"} Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.360226 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"800c114f-56e0-4bb3-8b43-f6b2f623584a","Type":"ContainerStarted","Data":"078193a9c68c2825a492f0f3731d568c122c6169ee4e9c8a6581834848783336"} Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.362213 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" event={"ID":"722e6d91-a445-4121-afd3-8c06ab5d7ce4","Type":"ContainerDied","Data":"b5a64e4a3278ad19c5f39193ee4fbae48273b7d614c0de4742317b48a9034908"} Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.362296 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-pr2gh" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.370130 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h749r" event={"ID":"4c42f036-0fe4-4ffc-9084-f6e64da6314c","Type":"ContainerStarted","Data":"ca0d672337c4493d3f05ce445675ae8612826cad1692459b76616f3f8dbbf6d7"} Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.370989 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"388e49e3-0d92-49a4-a165-810b7ac67577","Type":"ContainerStarted","Data":"bbaac5a23ce473804b5fc66b00c891ee2cd8b581ab87bebe5a52d1ba32fa4e90"} Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.372865 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"382c890c-5616-49ab-afd8-59fa071147b4","Type":"ContainerStarted","Data":"243f303c75904a4009568209d066a21cb5460e18f8c2a231a08df430e99fc9c2"} Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.373994 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"81cb6c96-a5d5-4120-8cb3-101344626b07","Type":"ContainerStarted","Data":"c5d04cf8737ff7bcae4d7e2f5aff74fd49b0f3f8569e65cce604ac434923853e"} Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.374795 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cqt22" event={"ID":"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41","Type":"ContainerStarted","Data":"8460cfac5a33ed8259f9c214c7782b386d6fe399de6f30527e723d1ba4d689d6"} Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.375758 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" event={"ID":"bf3440bc-e009-4bd4-8390-bf798b62db20","Type":"ContainerDied","Data":"af6a41f5af5dbe2e3073824ebf04c9edc690d73cb104e658f480bb7678b44f2d"} Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.375850 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-cwwkc" Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.404752 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 17:17:35 crc kubenswrapper[4841]: W1203 17:17:35.405779 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda85311e5_2270_4d86_a617_1b7da0a346c8.slice/crio-234d245efabb15699655d70f97254d590a4fd8897ea50a6a40bf5c3145943c47 WatchSource:0}: Error finding container 234d245efabb15699655d70f97254d590a4fd8897ea50a6a40bf5c3145943c47: Status 404 returned error can't find the container with id 234d245efabb15699655d70f97254d590a4fd8897ea50a6a40bf5c3145943c47 Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.446078 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pr2gh"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.454761 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-pr2gh"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.492367 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cwwkc"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.500006 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-cwwkc"] Dec 03 17:17:35 crc kubenswrapper[4841]: I1203 17:17:35.518631 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-cj8qw"] Dec 03 17:17:35 crc kubenswrapper[4841]: W1203 17:17:35.558367 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3fab8b5_6122_451f_9b66_a1dbb0813c1b.slice/crio-c2ce5e32381d851d4bfc27f692fc606a089d76fd17c10339885c61cce11a4571 WatchSource:0}: Error finding container c2ce5e32381d851d4bfc27f692fc606a089d76fd17c10339885c61cce11a4571: 
Status 404 returned error can't find the container with id c2ce5e32381d851d4bfc27f692fc606a089d76fd17c10339885c61cce11a4571 Dec 03 17:17:36 crc kubenswrapper[4841]: I1203 17:17:36.067321 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 17:17:36 crc kubenswrapper[4841]: I1203 17:17:36.250452 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="722e6d91-a445-4121-afd3-8c06ab5d7ce4" path="/var/lib/kubelet/pods/722e6d91-a445-4121-afd3-8c06ab5d7ce4/volumes" Dec 03 17:17:36 crc kubenswrapper[4841]: I1203 17:17:36.250816 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf3440bc-e009-4bd4-8390-bf798b62db20" path="/var/lib/kubelet/pods/bf3440bc-e009-4bd4-8390-bf798b62db20/volumes" Dec 03 17:17:36 crc kubenswrapper[4841]: I1203 17:17:36.385394 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a85311e5-2270-4d86-a617-1b7da0a346c8","Type":"ContainerStarted","Data":"234d245efabb15699655d70f97254d590a4fd8897ea50a6a40bf5c3145943c47"} Dec 03 17:17:36 crc kubenswrapper[4841]: I1203 17:17:36.387545 4841 generic.go:334] "Generic (PLEG): container finished" podID="4c42f036-0fe4-4ffc-9084-f6e64da6314c" containerID="ea6a2b5c0c77265661ec76cd7f40fc4451a89ec6a1d1f9882218dcf14fc2c04e" exitCode=0 Dec 03 17:17:36 crc kubenswrapper[4841]: I1203 17:17:36.387649 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h749r" event={"ID":"4c42f036-0fe4-4ffc-9084-f6e64da6314c","Type":"ContainerDied","Data":"ea6a2b5c0c77265661ec76cd7f40fc4451a89ec6a1d1f9882218dcf14fc2c04e"} Dec 03 17:17:36 crc kubenswrapper[4841]: W1203 17:17:36.411683 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9b50c59_2571_4a25_bff5_bc84b18d7315.slice/crio-150c27187da7ae161d868ab7e5b44ca473e8db42ccf0f38f3966c1857348d256 WatchSource:0}: Error finding container 
150c27187da7ae161d868ab7e5b44ca473e8db42ccf0f38f3966c1857348d256: Status 404 returned error can't find the container with id 150c27187da7ae161d868ab7e5b44ca473e8db42ccf0f38f3966c1857348d256 Dec 03 17:17:36 crc kubenswrapper[4841]: I1203 17:17:36.414895 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cj8qw" event={"ID":"b3fab8b5-6122-451f-9b66-a1dbb0813c1b","Type":"ContainerStarted","Data":"c2ce5e32381d851d4bfc27f692fc606a089d76fd17c10339885c61cce11a4571"} Dec 03 17:17:37 crc kubenswrapper[4841]: I1203 17:17:37.422801 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f9b50c59-2571-4a25-bff5-bc84b18d7315","Type":"ContainerStarted","Data":"150c27187da7ae161d868ab7e5b44ca473e8db42ccf0f38f3966c1857348d256"} Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.723232 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-6vxfx"] Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.726530 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.729157 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.735344 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6vxfx"] Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.813326 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.813386 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-config\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.813406 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq6fl\" (UniqueName: \"kubernetes.io/projected/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-kube-api-access-hq6fl\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.813440 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-ovn-rundir\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") 
" pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.813464 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-combined-ca-bundle\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.813495 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-ovs-rundir\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.860032 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h749r"] Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.886665 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bjq5g"] Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.888012 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.890240 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.908973 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bjq5g"] Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.915236 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.915288 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-config\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.915308 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq6fl\" (UniqueName: \"kubernetes.io/projected/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-kube-api-access-hq6fl\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.915343 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-ovn-rundir\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: 
I1203 17:17:40.915367 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-combined-ca-bundle\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.915398 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-ovs-rundir\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.915418 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkg6c\" (UniqueName: \"kubernetes.io/projected/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-kube-api-access-kkg6c\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.915461 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.915477 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-config\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 
17:17:40.915502 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.918773 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-ovs-rundir\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.919155 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-ovn-rundir\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.919402 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-config\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.928874 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-combined-ca-bundle\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.942307 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hq6fl\" (UniqueName: \"kubernetes.io/projected/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-kube-api-access-hq6fl\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:40 crc kubenswrapper[4841]: I1203 17:17:40.943726 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/805e03d0-c25b-4b59-8d0b-d526bc7fcc85-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-6vxfx\" (UID: \"805e03d0-c25b-4b59-8d0b-d526bc7fcc85\") " pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.004998 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fsj22"] Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.017365 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkg6c\" (UniqueName: \"kubernetes.io/projected/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-kube-api-access-kkg6c\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.017658 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.017762 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-config\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:41 crc 
kubenswrapper[4841]: I1203 17:17:41.017866 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.018546 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-config\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.018811 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.021225 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.038454 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2fgc6"] Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.039705 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.041778 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.047748 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-6vxfx" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.055502 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkg6c\" (UniqueName: \"kubernetes.io/projected/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-kube-api-access-kkg6c\") pod \"dnsmasq-dns-7fd796d7df-bjq5g\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.060565 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2fgc6"] Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.120035 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxk7t\" (UniqueName: \"kubernetes.io/projected/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-kube-api-access-dxk7t\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.120143 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.120183 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-config\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.120201 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.120263 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.207270 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.221842 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.221994 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dxk7t\" (UniqueName: \"kubernetes.io/projected/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-kube-api-access-dxk7t\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.222058 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.222131 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-config\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.222167 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" 
Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.223000 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.223152 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-config\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.223281 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.223374 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.244961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxk7t\" (UniqueName: \"kubernetes.io/projected/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-kube-api-access-dxk7t\") pod \"dnsmasq-dns-86db49b7ff-2fgc6\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") " pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:41 crc kubenswrapper[4841]: I1203 17:17:41.390801 4841 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:42 crc kubenswrapper[4841]: I1203 17:17:42.652066 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bjq5g"] Dec 03 17:17:43 crc kubenswrapper[4841]: I1203 17:17:43.472767 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" event={"ID":"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c","Type":"ContainerStarted","Data":"b4f6bdec395e0e6c809ed5cf3c1ea083fcd7bc0a4b215e7fad70f6218479a436"} Dec 03 17:17:43 crc kubenswrapper[4841]: I1203 17:17:43.905326 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-6vxfx"] Dec 03 17:17:44 crc kubenswrapper[4841]: W1203 17:17:44.000589 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod805e03d0_c25b_4b59_8d0b_d526bc7fcc85.slice/crio-5f6336f548faf71232894c07ff30d3e3e384090e1d6e52bf923009f8d19f52df WatchSource:0}: Error finding container 5f6336f548faf71232894c07ff30d3e3e384090e1d6e52bf923009f8d19f52df: Status 404 returned error can't find the container with id 5f6336f548faf71232894c07ff30d3e3e384090e1d6e52bf923009f8d19f52df Dec 03 17:17:44 crc kubenswrapper[4841]: I1203 17:17:44.480578 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2fgc6"] Dec 03 17:17:44 crc kubenswrapper[4841]: I1203 17:17:44.482124 4841 generic.go:334] "Generic (PLEG): container finished" podID="f06bb6b5-7c2d-4fda-ae35-bf0c099491f2" containerID="e83aab14df87e3bd1fbf361e3ec19987c9b2c9c5d29124a69cc1446600fb365c" exitCode=0 Dec 03 17:17:44 crc kubenswrapper[4841]: I1203 17:17:44.482195 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" event={"ID":"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2","Type":"ContainerDied","Data":"e83aab14df87e3bd1fbf361e3ec19987c9b2c9c5d29124a69cc1446600fb365c"} Dec 03 
17:17:44 crc kubenswrapper[4841]: I1203 17:17:44.484790 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"382c890c-5616-49ab-afd8-59fa071147b4","Type":"ContainerStarted","Data":"67563bed56f9bb42f3c255135f12a19561976880576c86890dc8292203e0403e"} Dec 03 17:17:44 crc kubenswrapper[4841]: I1203 17:17:44.487413 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6vxfx" event={"ID":"805e03d0-c25b-4b59-8d0b-d526bc7fcc85","Type":"ContainerStarted","Data":"5f6336f548faf71232894c07ff30d3e3e384090e1d6e52bf923009f8d19f52df"} Dec 03 17:17:44 crc kubenswrapper[4841]: W1203 17:17:44.548324 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8edb0d9b_2ead_4079_9dfd_dea987c55c8a.slice/crio-dd8a90121fbd56b75a18de5241acfb33d095ac37b7461931b10c3e2cd1f99c65 WatchSource:0}: Error finding container dd8a90121fbd56b75a18de5241acfb33d095ac37b7461931b10c3e2cd1f99c65: Status 404 returned error can't find the container with id dd8a90121fbd56b75a18de5241acfb33d095ac37b7461931b10c3e2cd1f99c65 Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.154671 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.295737 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwxxv\" (UniqueName: \"kubernetes.io/projected/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-kube-api-access-gwxxv\") pod \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.295849 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-config\") pod \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.296006 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-dns-svc\") pod \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\" (UID: \"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2\") " Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.305550 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-kube-api-access-gwxxv" (OuterVolumeSpecName: "kube-api-access-gwxxv") pod "f06bb6b5-7c2d-4fda-ae35-bf0c099491f2" (UID: "f06bb6b5-7c2d-4fda-ae35-bf0c099491f2"). InnerVolumeSpecName "kube-api-access-gwxxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.397953 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwxxv\" (UniqueName: \"kubernetes.io/projected/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-kube-api-access-gwxxv\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.496712 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h749r" event={"ID":"4c42f036-0fe4-4ffc-9084-f6e64da6314c","Type":"ContainerStarted","Data":"e073ceaba4d79f66d469606abddf87497f0052e83a3f29acde8be1158e1b5f49"} Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.496833 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-666b6646f7-h749r" podUID="4c42f036-0fe4-4ffc-9084-f6e64da6314c" containerName="dnsmasq-dns" containerID="cri-o://e073ceaba4d79f66d469606abddf87497f0052e83a3f29acde8be1158e1b5f49" gracePeriod=10 Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.498275 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.500356 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.500383 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-fsj22" event={"ID":"f06bb6b5-7c2d-4fda-ae35-bf0c099491f2","Type":"ContainerDied","Data":"8fff77c9c2f297d0a3d4ae4a4b9468cf8b612b1d2a8f67f96ff68e4f4b49c29d"} Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.500425 4841 scope.go:117] "RemoveContainer" containerID="e83aab14df87e3bd1fbf361e3ec19987c9b2c9c5d29124a69cc1446600fb365c" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.502507 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9e8bee6-ec4a-4743-9ca4-62c37c278958","Type":"ContainerStarted","Data":"b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a"} Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.505220 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" event={"ID":"8edb0d9b-2ead-4079-9dfd-dea987c55c8a","Type":"ContainerStarted","Data":"dd8a90121fbd56b75a18de5241acfb33d095ac37b7461931b10c3e2cd1f99c65"} Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.514081 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f06bb6b5-7c2d-4fda-ae35-bf0c099491f2" (UID: "f06bb6b5-7c2d-4fda-ae35-bf0c099491f2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.514299 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-config" (OuterVolumeSpecName: "config") pod "f06bb6b5-7c2d-4fda-ae35-bf0c099491f2" (UID: "f06bb6b5-7c2d-4fda-ae35-bf0c099491f2"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.520208 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-666b6646f7-h749r" podStartSLOduration=28.050352416 podStartE2EDuration="28.520190474s" podCreationTimestamp="2025-12-03 17:17:17 +0000 UTC" firstStartedPulling="2025-12-03 17:17:35.046356655 +0000 UTC m=+1049.433877382" lastFinishedPulling="2025-12-03 17:17:35.516194713 +0000 UTC m=+1049.903715440" observedRunningTime="2025-12-03 17:17:45.516349514 +0000 UTC m=+1059.903870241" watchObservedRunningTime="2025-12-03 17:17:45.520190474 +0000 UTC m=+1059.907711211" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.603090 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:45 crc kubenswrapper[4841]: I1203 17:17:45.603144 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.004536 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fsj22"] Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.013030 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-fsj22"] Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.287086 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f06bb6b5-7c2d-4fda-ae35-bf0c099491f2" path="/var/lib/kubelet/pods/f06bb6b5-7c2d-4fda-ae35-bf0c099491f2/volumes" Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.514361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"7c3685ed-a2fd-4f00-9452-70f9713117b3","Type":"ContainerStarted","Data":"d2324dac73125cb06551b82e6a7a7d3872f4763d736160f52920c38a02d38267"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.517923 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"81cb6c96-a5d5-4120-8cb3-101344626b07","Type":"ContainerStarted","Data":"522391f46d25d49a3f61f7038b2c4cc3cb48e5de7e4c1dab2e15d86d6605a642"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.518015 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.521341 4841 generic.go:334] "Generic (PLEG): container finished" podID="8edb0d9b-2ead-4079-9dfd-dea987c55c8a" containerID="9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9" exitCode=0 Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.521410 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" event={"ID":"8edb0d9b-2ead-4079-9dfd-dea987c55c8a","Type":"ContainerDied","Data":"9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.522948 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a85311e5-2270-4d86-a617-1b7da0a346c8","Type":"ContainerStarted","Data":"d96b34a0c01dc0540237e8379003fe4b9addf1ef0def127aa219bd63a9deaa78"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.524356 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f9b50c59-2571-4a25-bff5-bc84b18d7315","Type":"ContainerStarted","Data":"10cee799f37f79f34ecaf2f3de867f3e7b22942b4c0da8faaee1707f7c7c65fd"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.530487 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"388e49e3-0d92-49a4-a165-810b7ac67577","Type":"ContainerStarted","Data":"e648f4c2c0cd4cbfa576aed1bd3ce958793bcb6f8ef0a320a51436843de27c16"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.533109 4841 generic.go:334] "Generic (PLEG): container finished" podID="b3fab8b5-6122-451f-9b66-a1dbb0813c1b" containerID="d420532a57bf60a19cdb0bb1e35753e9252d9c763bd48a59c336474ca7d2bbc5" exitCode=0 Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.533201 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cj8qw" event={"ID":"b3fab8b5-6122-451f-9b66-a1dbb0813c1b","Type":"ContainerDied","Data":"d420532a57bf60a19cdb0bb1e35753e9252d9c763bd48a59c336474ca7d2bbc5"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.544148 4841 generic.go:334] "Generic (PLEG): container finished" podID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerID="5d6ba6783e68220ec8eb35934107c7af764061f59bf18445c99eb3463f0bb241" exitCode=0 Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.544269 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" event={"ID":"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c","Type":"ContainerDied","Data":"5d6ba6783e68220ec8eb35934107c7af764061f59bf18445c99eb3463f0bb241"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.549087 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cqt22" event={"ID":"41fcd3cb-c81b-4a15-bb57-aa38bfa47e41","Type":"ContainerStarted","Data":"76369b267d800d4b1622789f5f2aefa69ac1cb9d02f215bca0e6d4dcb6754094"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.550325 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-cqt22" Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.552524 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" 
event={"ID":"800c114f-56e0-4bb3-8b43-f6b2f623584a","Type":"ContainerStarted","Data":"841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.552816 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.555080 4841 generic.go:334] "Generic (PLEG): container finished" podID="4c42f036-0fe4-4ffc-9084-f6e64da6314c" containerID="e073ceaba4d79f66d469606abddf87497f0052e83a3f29acde8be1158e1b5f49" exitCode=0 Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.555184 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h749r" event={"ID":"4c42f036-0fe4-4ffc-9084-f6e64da6314c","Type":"ContainerDied","Data":"e073ceaba4d79f66d469606abddf87497f0052e83a3f29acde8be1158e1b5f49"} Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.579200 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=16.362190136 podStartE2EDuration="24.579182876s" podCreationTimestamp="2025-12-03 17:17:22 +0000 UTC" firstStartedPulling="2025-12-03 17:17:35.272521035 +0000 UTC m=+1049.660041762" lastFinishedPulling="2025-12-03 17:17:43.489513775 +0000 UTC m=+1057.877034502" observedRunningTime="2025-12-03 17:17:46.577585719 +0000 UTC m=+1060.965106446" watchObservedRunningTime="2025-12-03 17:17:46.579182876 +0000 UTC m=+1060.966703603" Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.651473 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=14.559576637 podStartE2EDuration="23.651453003s" podCreationTimestamp="2025-12-03 17:17:23 +0000 UTC" firstStartedPulling="2025-12-03 17:17:35.084367592 +0000 UTC m=+1049.471888319" lastFinishedPulling="2025-12-03 17:17:44.176243958 +0000 UTC m=+1058.563764685" observedRunningTime="2025-12-03 
17:17:46.649865976 +0000 UTC m=+1061.037386703" watchObservedRunningTime="2025-12-03 17:17:46.651453003 +0000 UTC m=+1061.038973730" Dec 03 17:17:46 crc kubenswrapper[4841]: I1203 17:17:46.693368 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-cqt22" podStartSLOduration=10.968905444 podStartE2EDuration="19.693331091s" podCreationTimestamp="2025-12-03 17:17:27 +0000 UTC" firstStartedPulling="2025-12-03 17:17:35.272824232 +0000 UTC m=+1049.660344959" lastFinishedPulling="2025-12-03 17:17:43.997249879 +0000 UTC m=+1058.384770606" observedRunningTime="2025-12-03 17:17:46.666507455 +0000 UTC m=+1061.054028182" watchObservedRunningTime="2025-12-03 17:17:46.693331091 +0000 UTC m=+1061.080851818" Dec 03 17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.752554 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.858043 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-config\") pod \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " Dec 03 17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.858102 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp77w\" (UniqueName: \"kubernetes.io/projected/4c42f036-0fe4-4ffc-9084-f6e64da6314c-kube-api-access-wp77w\") pod \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " Dec 03 17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.858201 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-dns-svc\") pod \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\" (UID: \"4c42f036-0fe4-4ffc-9084-f6e64da6314c\") " Dec 03 
17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.875640 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c42f036-0fe4-4ffc-9084-f6e64da6314c-kube-api-access-wp77w" (OuterVolumeSpecName: "kube-api-access-wp77w") pod "4c42f036-0fe4-4ffc-9084-f6e64da6314c" (UID: "4c42f036-0fe4-4ffc-9084-f6e64da6314c"). InnerVolumeSpecName "kube-api-access-wp77w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.895522 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-config" (OuterVolumeSpecName: "config") pod "4c42f036-0fe4-4ffc-9084-f6e64da6314c" (UID: "4c42f036-0fe4-4ffc-9084-f6e64da6314c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.896273 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c42f036-0fe4-4ffc-9084-f6e64da6314c" (UID: "4c42f036-0fe4-4ffc-9084-f6e64da6314c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.959961 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.960001 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c42f036-0fe4-4ffc-9084-f6e64da6314c-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:47 crc kubenswrapper[4841]: I1203 17:17:47.960011 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wp77w\" (UniqueName: \"kubernetes.io/projected/4c42f036-0fe4-4ffc-9084-f6e64da6314c-kube-api-access-wp77w\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.577067 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"a85311e5-2270-4d86-a617-1b7da0a346c8","Type":"ContainerStarted","Data":"67fd55dac5be423f4a1336d45e49e6253026c4edd2ea67d5818c454b64bd0cb0"} Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.581821 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-h749r" event={"ID":"4c42f036-0fe4-4ffc-9084-f6e64da6314c","Type":"ContainerDied","Data":"ca0d672337c4493d3f05ce445675ae8612826cad1692459b76616f3f8dbbf6d7"} Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.581878 4841 scope.go:117] "RemoveContainer" containerID="e073ceaba4d79f66d469606abddf87497f0052e83a3f29acde8be1158e1b5f49" Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.582011 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-h749r" Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.587356 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"f9b50c59-2571-4a25-bff5-bc84b18d7315","Type":"ContainerStarted","Data":"11e231f599549a80da4000087f312ca7d28b3e5c802621d3985c826e758e6365"} Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.589938 4841 generic.go:334] "Generic (PLEG): container finished" podID="382c890c-5616-49ab-afd8-59fa071147b4" containerID="67563bed56f9bb42f3c255135f12a19561976880576c86890dc8292203e0403e" exitCode=0 Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.590009 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"382c890c-5616-49ab-afd8-59fa071147b4","Type":"ContainerDied","Data":"67563bed56f9bb42f3c255135f12a19561976880576c86890dc8292203e0403e"} Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.599325 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cj8qw" event={"ID":"b3fab8b5-6122-451f-9b66-a1dbb0813c1b","Type":"ContainerStarted","Data":"8f39718edfdfa718408778575fdbd040a94d1d7ea5d885c5e4a3099bcb5a7821"} Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.604429 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" event={"ID":"8edb0d9b-2ead-4079-9dfd-dea987c55c8a","Type":"ContainerStarted","Data":"83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32"} Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.604628 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.606029 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=5.910296575 podStartE2EDuration="18.606010175s" 
podCreationTimestamp="2025-12-03 17:17:30 +0000 UTC" firstStartedPulling="2025-12-03 17:17:35.408269004 +0000 UTC m=+1049.795789731" lastFinishedPulling="2025-12-03 17:17:48.103982604 +0000 UTC m=+1062.491503331" observedRunningTime="2025-12-03 17:17:48.592846537 +0000 UTC m=+1062.980367264" watchObservedRunningTime="2025-12-03 17:17:48.606010175 +0000 UTC m=+1062.993530902" Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.608723 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-6vxfx" event={"ID":"805e03d0-c25b-4b59-8d0b-d526bc7fcc85","Type":"ContainerStarted","Data":"e6cba7e7233ff4f259c34f7d650e4ff6f9e7c8ed551509b6a2885687f7082e05"} Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.615852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" event={"ID":"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c","Type":"ContainerStarted","Data":"9c3f8f43c26ef9a6f5680631e1ded71643bddbdc483d0d4e052e947f29402cd3"} Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.616063 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.630894 4841 scope.go:117] "RemoveContainer" containerID="ea6a2b5c0c77265661ec76cd7f40fc4451a89ec6a1d1f9882218dcf14fc2c04e" Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.664567 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.958995445 podStartE2EDuration="18.664541291s" podCreationTimestamp="2025-12-03 17:17:30 +0000 UTC" firstStartedPulling="2025-12-03 17:17:36.417607856 +0000 UTC m=+1050.805128583" lastFinishedPulling="2025-12-03 17:17:48.123153702 +0000 UTC m=+1062.510674429" observedRunningTime="2025-12-03 17:17:48.653254357 +0000 UTC m=+1063.040775104" watchObservedRunningTime="2025-12-03 17:17:48.664541291 +0000 UTC m=+1063.052062018" Dec 03 17:17:48 crc 
kubenswrapper[4841]: I1203 17:17:48.685759 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-6vxfx" podStartSLOduration=4.590575631 podStartE2EDuration="8.685736656s" podCreationTimestamp="2025-12-03 17:17:40 +0000 UTC" firstStartedPulling="2025-12-03 17:17:44.018147997 +0000 UTC m=+1058.405668724" lastFinishedPulling="2025-12-03 17:17:48.113309022 +0000 UTC m=+1062.500829749" observedRunningTime="2025-12-03 17:17:48.676449379 +0000 UTC m=+1063.063970106" watchObservedRunningTime="2025-12-03 17:17:48.685736656 +0000 UTC m=+1063.073257383" Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.705859 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h749r"] Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.711834 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-h749r"] Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.732661 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" podStartSLOduration=7.732633651 podStartE2EDuration="7.732633651s" podCreationTimestamp="2025-12-03 17:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:17:48.722125275 +0000 UTC m=+1063.109646082" watchObservedRunningTime="2025-12-03 17:17:48.732633651 +0000 UTC m=+1063.120154388" Dec 03 17:17:48 crc kubenswrapper[4841]: I1203 17:17:48.756805 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" podStartSLOduration=8.756788095 podStartE2EDuration="8.756788095s" podCreationTimestamp="2025-12-03 17:17:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:17:48.752765751 +0000 UTC m=+1063.140286478" 
watchObservedRunningTime="2025-12-03 17:17:48.756788095 +0000 UTC m=+1063.144308822" Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.627833 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"382c890c-5616-49ab-afd8-59fa071147b4","Type":"ContainerStarted","Data":"72bb4f93d5c0602dd0bab60f6989b0760d119bf05532bb98c806ac5fde5b14e3"} Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.631835 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-cj8qw" event={"ID":"b3fab8b5-6122-451f-9b66-a1dbb0813c1b","Type":"ContainerStarted","Data":"5ab0d19265ca7b27e15154b16047a872951960ef4d97bfd39689df694c52ec40"} Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.632028 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.632044 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.634373 4841 generic.go:334] "Generic (PLEG): container finished" podID="7c3685ed-a2fd-4f00-9452-70f9713117b3" containerID="d2324dac73125cb06551b82e6a7a7d3872f4763d736160f52920c38a02d38267" exitCode=0 Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.634512 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"7c3685ed-a2fd-4f00-9452-70f9713117b3","Type":"ContainerDied","Data":"d2324dac73125cb06551b82e6a7a7d3872f4763d736160f52920c38a02d38267"} Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.653819 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.676103 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=21.499372032 
podStartE2EDuration="28.676074976s" podCreationTimestamp="2025-12-03 17:17:21 +0000 UTC" firstStartedPulling="2025-12-03 17:17:35.083537623 +0000 UTC m=+1049.471058350" lastFinishedPulling="2025-12-03 17:17:42.260240567 +0000 UTC m=+1056.647761294" observedRunningTime="2025-12-03 17:17:49.659162072 +0000 UTC m=+1064.046682809" watchObservedRunningTime="2025-12-03 17:17:49.676074976 +0000 UTC m=+1064.063595743" Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.705415 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.725571 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-cj8qw" podStartSLOduration=16.025526935 podStartE2EDuration="22.725516081s" podCreationTimestamp="2025-12-03 17:17:27 +0000 UTC" firstStartedPulling="2025-12-03 17:17:35.562003862 +0000 UTC m=+1049.949524589" lastFinishedPulling="2025-12-03 17:17:42.261993008 +0000 UTC m=+1056.649513735" observedRunningTime="2025-12-03 17:17:49.69165869 +0000 UTC m=+1064.079179417" watchObservedRunningTime="2025-12-03 17:17:49.725516081 +0000 UTC m=+1064.113036828" Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.849161 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:49 crc kubenswrapper[4841]: I1203 17:17:49.883071 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.250376 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c42f036-0fe4-4ffc-9084-f6e64da6314c" path="/var/lib/kubelet/pods/4c42f036-0fe4-4ffc-9084-f6e64da6314c/volumes" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.659928 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"7c3685ed-a2fd-4f00-9452-70f9713117b3","Type":"ContainerStarted","Data":"53b54d34bd5834a1794f2c0b0f11bbd4cba4df7ae4fb22e3093c9d6865fb03d7"} Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.660704 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.660743 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.690889 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.947413149 podStartE2EDuration="31.690871947s" podCreationTimestamp="2025-12-03 17:17:19 +0000 UTC" firstStartedPulling="2025-12-03 17:17:34.258838799 +0000 UTC m=+1048.646359526" lastFinishedPulling="2025-12-03 17:17:44.002297597 +0000 UTC m=+1058.389818324" observedRunningTime="2025-12-03 17:17:50.689319001 +0000 UTC m=+1065.076839728" watchObservedRunningTime="2025-12-03 17:17:50.690871947 +0000 UTC m=+1065.078392674" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.716566 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.716614 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.961837 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 17:17:50 crc kubenswrapper[4841]: E1203 17:17:50.962252 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f06bb6b5-7c2d-4fda-ae35-bf0c099491f2" containerName="init" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.962268 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f06bb6b5-7c2d-4fda-ae35-bf0c099491f2" containerName="init" Dec 03 17:17:50 crc 
kubenswrapper[4841]: E1203 17:17:50.962283 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c42f036-0fe4-4ffc-9084-f6e64da6314c" containerName="init" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.962292 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c42f036-0fe4-4ffc-9084-f6e64da6314c" containerName="init" Dec 03 17:17:50 crc kubenswrapper[4841]: E1203 17:17:50.962308 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c42f036-0fe4-4ffc-9084-f6e64da6314c" containerName="dnsmasq-dns" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.962319 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c42f036-0fe4-4ffc-9084-f6e64da6314c" containerName="dnsmasq-dns" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.962523 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c42f036-0fe4-4ffc-9084-f6e64da6314c" containerName="dnsmasq-dns" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.962540 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f06bb6b5-7c2d-4fda-ae35-bf0c099491f2" containerName="init" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.963603 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.966421 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.966438 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-n8hhc" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.966617 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.972219 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.975288 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.976254 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 17:17:50 crc kubenswrapper[4841]: I1203 17:17:50.976279 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.033130 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd132af-f941-486f-8791-402bae76197f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.033175 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd132af-f941-486f-8791-402bae76197f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 
17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.033201 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcpqj\" (UniqueName: \"kubernetes.io/projected/cbd132af-f941-486f-8791-402bae76197f-kube-api-access-xcpqj\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.033230 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbd132af-f941-486f-8791-402bae76197f-scripts\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.033420 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cbd132af-f941-486f-8791-402bae76197f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.033472 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd132af-f941-486f-8791-402bae76197f-config\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.033495 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd132af-f941-486f-8791-402bae76197f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.135377 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd132af-f941-486f-8791-402bae76197f-config\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.135430 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cbd132af-f941-486f-8791-402bae76197f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.135458 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd132af-f941-486f-8791-402bae76197f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.135746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd132af-f941-486f-8791-402bae76197f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.135804 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd132af-f941-486f-8791-402bae76197f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.135830 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcpqj\" (UniqueName: \"kubernetes.io/projected/cbd132af-f941-486f-8791-402bae76197f-kube-api-access-xcpqj\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " 
pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.136285 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/cbd132af-f941-486f-8791-402bae76197f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.136422 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbd132af-f941-486f-8791-402bae76197f-config\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.136566 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbd132af-f941-486f-8791-402bae76197f-scripts\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.137104 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cbd132af-f941-486f-8791-402bae76197f-scripts\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.141234 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbd132af-f941-486f-8791-402bae76197f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.141373 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd132af-f941-486f-8791-402bae76197f-metrics-certs-tls-certs\") 
pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.141476 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/cbd132af-f941-486f-8791-402bae76197f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.166318 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcpqj\" (UniqueName: \"kubernetes.io/projected/cbd132af-f941-486f-8791-402bae76197f-kube-api-access-xcpqj\") pod \"ovn-northd-0\" (UID: \"cbd132af-f941-486f-8791-402bae76197f\") " pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.294543 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 17:17:51 crc kubenswrapper[4841]: W1203 17:17:51.738448 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbd132af_f941_486f_8791_402bae76197f.slice/crio-89c0b0ab92c3978a402882ae0344363ec854c1d146e3dc4af0de8db37070a253 WatchSource:0}: Error finding container 89c0b0ab92c3978a402882ae0344363ec854c1d146e3dc4af0de8db37070a253: Status 404 returned error can't find the container with id 89c0b0ab92c3978a402882ae0344363ec854c1d146e3dc4af0de8db37070a253 Dec 03 17:17:51 crc kubenswrapper[4841]: I1203 17:17:51.738878 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 17:17:52 crc kubenswrapper[4841]: I1203 17:17:52.512538 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:52 crc kubenswrapper[4841]: I1203 17:17:52.512845 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/openstack-cell1-galera-0" Dec 03 17:17:52 crc kubenswrapper[4841]: I1203 17:17:52.601053 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 17:17:52 crc kubenswrapper[4841]: I1203 17:17:52.715066 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cbd132af-f941-486f-8791-402bae76197f","Type":"ContainerStarted","Data":"89c0b0ab92c3978a402882ae0344363ec854c1d146e3dc4af0de8db37070a253"} Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.303525 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.448134 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bjq5g"] Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.448459 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" podUID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerName="dnsmasq-dns" containerID="cri-o://9c3f8f43c26ef9a6f5680631e1ded71643bddbdc483d0d4e052e947f29402cd3" gracePeriod=10 Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.449022 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.517858 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-znrqh"] Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.526100 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.540879 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-znrqh"] Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.605417 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww448\" (UniqueName: \"kubernetes.io/projected/0d1d957c-03f0-472f-888f-f410cb214bba-kube-api-access-ww448\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.605494 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-dns-svc\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.605650 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.605780 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.605830 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-config\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.707828 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww448\" (UniqueName: \"kubernetes.io/projected/0d1d957c-03f0-472f-888f-f410cb214bba-kube-api-access-ww448\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.707902 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-dns-svc\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.707939 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.707970 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.707995 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-config\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.708967 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.709007 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-dns-svc\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.709040 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-config\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.709361 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.728195 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww448\" (UniqueName: \"kubernetes.io/projected/0d1d957c-03f0-472f-888f-f410cb214bba-kube-api-access-ww448\") pod 
\"dnsmasq-dns-698758b865-znrqh\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.731426 4841 generic.go:334] "Generic (PLEG): container finished" podID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerID="9c3f8f43c26ef9a6f5680631e1ded71643bddbdc483d0d4e052e947f29402cd3" exitCode=0 Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.731459 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" event={"ID":"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c","Type":"ContainerDied","Data":"9c3f8f43c26ef9a6f5680631e1ded71643bddbdc483d0d4e052e947f29402cd3"} Dec 03 17:17:54 crc kubenswrapper[4841]: I1203 17:17:54.857292 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.519825 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.527868 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.530477 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.530674 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-4vsm4" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.530804 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.530916 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.551519 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.623725 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.623803 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.623847 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73a827c8-6b3c-4ffa-9c76-3d3591f38182-cache\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc 
kubenswrapper[4841]: I1203 17:17:55.623995 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/73a827c8-6b3c-4ffa-9c76-3d3591f38182-lock\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.624070 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dznqf\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-kube-api-access-dznqf\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.725975 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/73a827c8-6b3c-4ffa-9c76-3d3591f38182-lock\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.726039 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dznqf\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-kube-api-access-dznqf\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.726088 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.726123 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.726154 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73a827c8-6b3c-4ffa-9c76-3d3591f38182-cache\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: E1203 17:17:55.726313 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 17:17:55 crc kubenswrapper[4841]: E1203 17:17:55.728212 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.726545 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.726972 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/73a827c8-6b3c-4ffa-9c76-3d3591f38182-cache\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.727025 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/73a827c8-6b3c-4ffa-9c76-3d3591f38182-lock\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc 
kubenswrapper[4841]: E1203 17:17:55.728322 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift podName:73a827c8-6b3c-4ffa-9c76-3d3591f38182 nodeName:}" failed. No retries permitted until 2025-12-03 17:17:56.228260319 +0000 UTC m=+1070.615781046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift") pod "swift-storage-0" (UID: "73a827c8-6b3c-4ffa-9c76-3d3591f38182") : configmap "swift-ring-files" not found Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.751277 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dznqf\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-kube-api-access-dznqf\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:55 crc kubenswrapper[4841]: I1203 17:17:55.758981 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:56 crc kubenswrapper[4841]: I1203 17:17:56.234057 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:56 crc kubenswrapper[4841]: E1203 17:17:56.234236 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 17:17:56 crc kubenswrapper[4841]: E1203 17:17:56.234254 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod 
openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 17:17:56 crc kubenswrapper[4841]: E1203 17:17:56.234298 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift podName:73a827c8-6b3c-4ffa-9c76-3d3591f38182 nodeName:}" failed. No retries permitted until 2025-12-03 17:17:57.234284323 +0000 UTC m=+1071.621805050 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift") pod "swift-storage-0" (UID: "73a827c8-6b3c-4ffa-9c76-3d3591f38182") : configmap "swift-ring-files" not found Dec 03 17:17:56 crc kubenswrapper[4841]: I1203 17:17:56.392105 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" Dec 03 17:17:57 crc kubenswrapper[4841]: I1203 17:17:57.251536 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:57 crc kubenswrapper[4841]: E1203 17:17:57.251689 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 17:17:57 crc kubenswrapper[4841]: E1203 17:17:57.252610 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 17:17:57 crc kubenswrapper[4841]: E1203 17:17:57.252712 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift podName:73a827c8-6b3c-4ffa-9c76-3d3591f38182 nodeName:}" failed. No retries permitted until 2025-12-03 17:17:59.252695848 +0000 UTC m=+1073.640216575 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift") pod "swift-storage-0" (UID: "73a827c8-6b3c-4ffa-9c76-3d3591f38182") : configmap "swift-ring-files" not found Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.286153 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:17:59 crc kubenswrapper[4841]: E1203 17:17:59.286353 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 17:17:59 crc kubenswrapper[4841]: E1203 17:17:59.286370 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 17:17:59 crc kubenswrapper[4841]: E1203 17:17:59.286419 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift podName:73a827c8-6b3c-4ffa-9c76-3d3591f38182 nodeName:}" failed. No retries permitted until 2025-12-03 17:18:03.286404777 +0000 UTC m=+1077.673925504 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift") pod "swift-storage-0" (UID: "73a827c8-6b3c-4ffa-9c76-3d3591f38182") : configmap "swift-ring-files" not found Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.491164 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gpx2h"] Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.492547 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.495515 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.495753 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.496585 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.502818 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-gpx2h"] Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.525274 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gpx2h"] Dec 03 17:17:59 crc kubenswrapper[4841]: E1203 17:17:59.538069 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-j5tbs ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-j5tbs ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-gpx2h" podUID="21312f37-3ef1-43ab-a952-a819f94d6ae4" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.558019 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-hzcwz"] Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.559451 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.567971 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hzcwz"] Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.590966 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5tbs\" (UniqueName: \"kubernetes.io/projected/21312f37-3ef1-43ab-a952-a819f94d6ae4-kube-api-access-j5tbs\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.591045 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21312f37-3ef1-43ab-a952-a819f94d6ae4-etc-swift\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.591070 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-swiftconf\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.591248 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-combined-ca-bundle\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.591298 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-scripts\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.591417 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-dispersionconf\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.591471 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-ring-data-devices\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.692876 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-combined-ca-bundle\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.692995 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-ring-data-devices\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693046 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-scripts\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693107 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21312f37-3ef1-43ab-a952-a819f94d6ae4-etc-swift\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693134 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-swiftconf\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-swiftconf\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693214 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-combined-ca-bundle\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693243 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-scripts\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693267 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8be83cad-e31a-463f-9eca-837549c69fba-etc-swift\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693494 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21312f37-3ef1-43ab-a952-a819f94d6ae4-etc-swift\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693560 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-dispersionconf\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693679 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqkc5\" (UniqueName: \"kubernetes.io/projected/8be83cad-e31a-463f-9eca-837549c69fba-kube-api-access-gqkc5\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-dispersionconf\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.693778 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-ring-data-devices\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.694007 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-scripts\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.694031 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5tbs\" (UniqueName: \"kubernetes.io/projected/21312f37-3ef1-43ab-a952-a819f94d6ae4-kube-api-access-j5tbs\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.694514 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-ring-data-devices\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.699506 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-swiftconf\") pod 
\"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.699847 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-dispersionconf\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.704063 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-combined-ca-bundle\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.718532 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5tbs\" (UniqueName: \"kubernetes.io/projected/21312f37-3ef1-43ab-a952-a819f94d6ae4-kube-api-access-j5tbs\") pod \"swift-ring-rebalance-gpx2h\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.784804 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.795842 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqkc5\" (UniqueName: \"kubernetes.io/projected/8be83cad-e31a-463f-9eca-837549c69fba-kube-api-access-gqkc5\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.795892 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-dispersionconf\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.796005 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-combined-ca-bundle\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.796029 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-ring-data-devices\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.796073 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-scripts\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " 
pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.796117 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.796124 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-swiftconf\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.796788 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8be83cad-e31a-463f-9eca-837549c69fba-etc-swift\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.797009 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-ring-data-devices\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.797345 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8be83cad-e31a-463f-9eca-837549c69fba-etc-swift\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.798597 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-scripts\") pod \"swift-ring-rebalance-hzcwz\" (UID: 
\"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.800170 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-dispersionconf\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.800476 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-combined-ca-bundle\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.803574 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-swiftconf\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.813217 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqkc5\" (UniqueName: \"kubernetes.io/projected/8be83cad-e31a-463f-9eca-837549c69fba-kube-api-access-gqkc5\") pod \"swift-ring-rebalance-hzcwz\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") " pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.874232 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.898127 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-combined-ca-bundle\") pod \"21312f37-3ef1-43ab-a952-a819f94d6ae4\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.898230 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-dispersionconf\") pod \"21312f37-3ef1-43ab-a952-a819f94d6ae4\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.898261 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-swiftconf\") pod \"21312f37-3ef1-43ab-a952-a819f94d6ae4\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.898294 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-ring-data-devices\") pod \"21312f37-3ef1-43ab-a952-a819f94d6ae4\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.898335 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21312f37-3ef1-43ab-a952-a819f94d6ae4-etc-swift\") pod \"21312f37-3ef1-43ab-a952-a819f94d6ae4\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.898884 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "21312f37-3ef1-43ab-a952-a819f94d6ae4" (UID: "21312f37-3ef1-43ab-a952-a819f94d6ae4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.898963 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21312f37-3ef1-43ab-a952-a819f94d6ae4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "21312f37-3ef1-43ab-a952-a819f94d6ae4" (UID: "21312f37-3ef1-43ab-a952-a819f94d6ae4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.899116 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-scripts\") pod \"21312f37-3ef1-43ab-a952-a819f94d6ae4\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.899189 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5tbs\" (UniqueName: \"kubernetes.io/projected/21312f37-3ef1-43ab-a952-a819f94d6ae4-kube-api-access-j5tbs\") pod \"21312f37-3ef1-43ab-a952-a819f94d6ae4\" (UID: \"21312f37-3ef1-43ab-a952-a819f94d6ae4\") " Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.899495 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-scripts" (OuterVolumeSpecName: "scripts") pod "21312f37-3ef1-43ab-a952-a819f94d6ae4" (UID: "21312f37-3ef1-43ab-a952-a819f94d6ae4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.899708 4841 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.899734 4841 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21312f37-3ef1-43ab-a952-a819f94d6ae4-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.899748 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21312f37-3ef1-43ab-a952-a819f94d6ae4-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.902697 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "21312f37-3ef1-43ab-a952-a819f94d6ae4" (UID: "21312f37-3ef1-43ab-a952-a819f94d6ae4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.904104 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21312f37-3ef1-43ab-a952-a819f94d6ae4-kube-api-access-j5tbs" (OuterVolumeSpecName: "kube-api-access-j5tbs") pod "21312f37-3ef1-43ab-a952-a819f94d6ae4" (UID: "21312f37-3ef1-43ab-a952-a819f94d6ae4"). InnerVolumeSpecName "kube-api-access-j5tbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.905053 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21312f37-3ef1-43ab-a952-a819f94d6ae4" (UID: "21312f37-3ef1-43ab-a952-a819f94d6ae4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:17:59 crc kubenswrapper[4841]: I1203 17:17:59.906083 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "21312f37-3ef1-43ab-a952-a819f94d6ae4" (UID: "21312f37-3ef1-43ab-a952-a819f94d6ae4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:18:00 crc kubenswrapper[4841]: I1203 17:18:00.001600 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5tbs\" (UniqueName: \"kubernetes.io/projected/21312f37-3ef1-43ab-a952-a819f94d6ae4-kube-api-access-j5tbs\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:00 crc kubenswrapper[4841]: I1203 17:18:00.001645 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:00 crc kubenswrapper[4841]: I1203 17:18:00.001657 4841 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:00 crc kubenswrapper[4841]: I1203 17:18:00.001667 4841 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21312f37-3ef1-43ab-a952-a819f94d6ae4-swiftconf\") on node \"crc\" 
DevicePath \"\"" Dec 03 17:18:00 crc kubenswrapper[4841]: I1203 17:18:00.795747 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gpx2h" Dec 03 17:18:00 crc kubenswrapper[4841]: I1203 17:18:00.850303 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gpx2h"] Dec 03 17:18:00 crc kubenswrapper[4841]: I1203 17:18:00.861982 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-gpx2h"] Dec 03 17:18:01 crc kubenswrapper[4841]: I1203 17:18:01.209288 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" podUID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: i/o timeout" Dec 03 17:18:02 crc kubenswrapper[4841]: I1203 17:18:02.251043 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21312f37-3ef1-43ab-a952-a819f94d6ae4" path="/var/lib/kubelet/pods/21312f37-3ef1-43ab-a952-a819f94d6ae4/volumes" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.325328 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.359148 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:18:03 crc kubenswrapper[4841]: E1203 17:18:03.359857 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 17:18:03 crc kubenswrapper[4841]: E1203 17:18:03.359899 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not 
found Dec 03 17:18:03 crc kubenswrapper[4841]: E1203 17:18:03.359976 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift podName:73a827c8-6b3c-4ffa-9c76-3d3591f38182 nodeName:}" failed. No retries permitted until 2025-12-03 17:18:11.359958258 +0000 UTC m=+1085.747478985 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift") pod "swift-storage-0" (UID: "73a827c8-6b3c-4ffa-9c76-3d3591f38182") : configmap "swift-ring-files" not found Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.449861 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.519096 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.562637 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-dns-svc\") pod \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.562794 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-ovsdbserver-nb\") pod \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.562888 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkg6c\" (UniqueName: \"kubernetes.io/projected/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-kube-api-access-kkg6c\") pod 
\"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.562982 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-config\") pod \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\" (UID: \"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c\") " Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.569798 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-kube-api-access-kkg6c" (OuterVolumeSpecName: "kube-api-access-kkg6c") pod "25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" (UID: "25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c"). InnerVolumeSpecName "kube-api-access-kkg6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.610964 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" (UID: "25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.612022 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" (UID: "25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.626284 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-config" (OuterVolumeSpecName: "config") pod "25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" (UID: "25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.664602 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkg6c\" (UniqueName: \"kubernetes.io/projected/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-kube-api-access-kkg6c\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.664641 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.664653 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.664664 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.748781 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-znrqh"] Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.825199 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-znrqh" 
event={"ID":"0d1d957c-03f0-472f-888f-f410cb214bba","Type":"ContainerStarted","Data":"6fb7dc9843eeafde5febd4d60b3c72b2de2bbad7ef5cd675bae90ca03582c364"} Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.825640 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-hzcwz"] Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.828411 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cbd132af-f941-486f-8791-402bae76197f","Type":"ContainerStarted","Data":"acb9545e94e2bf8f211f4f1a2f8a15f12f632e99cd65ea0f181360ff24c94523"} Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.828545 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"cbd132af-f941-486f-8791-402bae76197f","Type":"ContainerStarted","Data":"5cf8ec0d4edffa109eecd38c77ca00e12fb31cd4b09e66431be476fc6731555f"} Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.828749 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.836285 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.836276 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" event={"ID":"25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c","Type":"ContainerDied","Data":"b4f6bdec395e0e6c809ed5cf3c1ea083fcd7bc0a4b215e7fad70f6218479a436"} Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.836637 4841 scope.go:117] "RemoveContainer" containerID="9c3f8f43c26ef9a6f5680631e1ded71643bddbdc483d0d4e052e947f29402cd3" Dec 03 17:18:03 crc kubenswrapper[4841]: W1203 17:18:03.858768 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8be83cad_e31a_463f_9eca_837549c69fba.slice/crio-56913ba5a8680745f695737afc4b7272d0aa6369e47f7961c6dcd038b9ca49e4 WatchSource:0}: Error finding container 56913ba5a8680745f695737afc4b7272d0aa6369e47f7961c6dcd038b9ca49e4: Status 404 returned error can't find the container with id 56913ba5a8680745f695737afc4b7272d0aa6369e47f7961c6dcd038b9ca49e4 Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.859115 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.297466355 podStartE2EDuration="13.859102971s" podCreationTimestamp="2025-12-03 17:17:50 +0000 UTC" firstStartedPulling="2025-12-03 17:17:51.741952446 +0000 UTC m=+1066.129473183" lastFinishedPulling="2025-12-03 17:18:03.303589072 +0000 UTC m=+1077.691109799" observedRunningTime="2025-12-03 17:18:03.850335166 +0000 UTC m=+1078.237855903" watchObservedRunningTime="2025-12-03 17:18:03.859102971 +0000 UTC m=+1078.246623698" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.880129 4841 scope.go:117] "RemoveContainer" containerID="5d6ba6783e68220ec8eb35934107c7af764061f59bf18445c99eb3463f0bb241" Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.881799 4841 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bjq5g"] Dec 03 17:18:03 crc kubenswrapper[4841]: I1203 17:18:03.889474 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-bjq5g"] Dec 03 17:18:04 crc kubenswrapper[4841]: I1203 17:18:04.256512 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" path="/var/lib/kubelet/pods/25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c/volumes" Dec 03 17:18:04 crc kubenswrapper[4841]: I1203 17:18:04.847135 4841 generic.go:334] "Generic (PLEG): container finished" podID="0d1d957c-03f0-472f-888f-f410cb214bba" containerID="dd07afb0b783f3ea965266c320416a3dab1c19712d9646d949f37f48af741f70" exitCode=0 Dec 03 17:18:04 crc kubenswrapper[4841]: I1203 17:18:04.847200 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-znrqh" event={"ID":"0d1d957c-03f0-472f-888f-f410cb214bba","Type":"ContainerDied","Data":"dd07afb0b783f3ea965266c320416a3dab1c19712d9646d949f37f48af741f70"} Dec 03 17:18:04 crc kubenswrapper[4841]: I1203 17:18:04.848930 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hzcwz" event={"ID":"8be83cad-e31a-463f-9eca-837549c69fba","Type":"ContainerStarted","Data":"56913ba5a8680745f695737afc4b7272d0aa6369e47f7961c6dcd038b9ca49e4"} Dec 03 17:18:05 crc kubenswrapper[4841]: I1203 17:18:05.137251 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 17:18:05 crc kubenswrapper[4841]: I1203 17:18:05.225600 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 17:18:05 crc kubenswrapper[4841]: I1203 17:18:05.859757 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-znrqh" 
event={"ID":"0d1d957c-03f0-472f-888f-f410cb214bba","Type":"ContainerStarted","Data":"f7c37848c1193c51c133fd068f98e947890b68d25d11f0a02566abbca90f818a"} Dec 03 17:18:05 crc kubenswrapper[4841]: I1203 17:18:05.859889 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:18:05 crc kubenswrapper[4841]: I1203 17:18:05.883927 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-znrqh" podStartSLOduration=11.883887291 podStartE2EDuration="11.883887291s" podCreationTimestamp="2025-12-03 17:17:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:05.878018034 +0000 UTC m=+1080.265538761" watchObservedRunningTime="2025-12-03 17:18:05.883887291 +0000 UTC m=+1080.271408018" Dec 03 17:18:06 crc kubenswrapper[4841]: I1203 17:18:06.209804 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7fd796d7df-bjq5g" podUID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.109:5353: i/o timeout" Dec 03 17:18:07 crc kubenswrapper[4841]: I1203 17:18:07.883042 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hzcwz" event={"ID":"8be83cad-e31a-463f-9eca-837549c69fba","Type":"ContainerStarted","Data":"f9c8df7e9d20af8562d306f9f9a1e4a38d776b81c058efdba891dfc27e4e076a"} Dec 03 17:18:07 crc kubenswrapper[4841]: I1203 17:18:07.910137 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-hzcwz" podStartSLOduration=6.011600687 podStartE2EDuration="8.910119405s" podCreationTimestamp="2025-12-03 17:17:59 +0000 UTC" firstStartedPulling="2025-12-03 17:18:03.860736219 +0000 UTC m=+1078.248256946" lastFinishedPulling="2025-12-03 17:18:06.759254947 +0000 UTC m=+1081.146775664" 
observedRunningTime="2025-12-03 17:18:07.904814011 +0000 UTC m=+1082.292334758" watchObservedRunningTime="2025-12-03 17:18:07.910119405 +0000 UTC m=+1082.297640142" Dec 03 17:18:11 crc kubenswrapper[4841]: I1203 17:18:11.408427 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:18:11 crc kubenswrapper[4841]: E1203 17:18:11.408606 4841 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 17:18:11 crc kubenswrapper[4841]: E1203 17:18:11.409002 4841 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 17:18:11 crc kubenswrapper[4841]: E1203 17:18:11.409081 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift podName:73a827c8-6b3c-4ffa-9c76-3d3591f38182 nodeName:}" failed. No retries permitted until 2025-12-03 17:18:27.40905945 +0000 UTC m=+1101.796580177 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift") pod "swift-storage-0" (UID: "73a827c8-6b3c-4ffa-9c76-3d3591f38182") : configmap "swift-ring-files" not found Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.287559 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ec5e-account-create-update-bl669"] Dec 03 17:18:12 crc kubenswrapper[4841]: E1203 17:18:12.288255 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerName="init" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.288278 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerName="init" Dec 03 17:18:12 crc kubenswrapper[4841]: E1203 17:18:12.288315 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerName="dnsmasq-dns" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.288324 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerName="dnsmasq-dns" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.288468 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="25ba25d7-ce4c-4123-b0cb-a02b42f8fc3c" containerName="dnsmasq-dns" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.289031 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.292784 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.306984 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-j5km2"] Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.308073 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j5km2" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.324107 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1440ad0-f18c-4f97-8001-3a4aaf316279-operator-scripts\") pod \"keystone-db-create-j5km2\" (UID: \"b1440ad0-f18c-4f97-8001-3a4aaf316279\") " pod="openstack/keystone-db-create-j5km2" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.324230 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrhcj\" (UniqueName: \"kubernetes.io/projected/b1440ad0-f18c-4f97-8001-3a4aaf316279-kube-api-access-qrhcj\") pod \"keystone-db-create-j5km2\" (UID: \"b1440ad0-f18c-4f97-8001-3a4aaf316279\") " pod="openstack/keystone-db-create-j5km2" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.324304 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qpd2\" (UniqueName: \"kubernetes.io/projected/fb0a89b5-7071-4060-a386-ccf821af25ec-kube-api-access-9qpd2\") pod \"keystone-ec5e-account-create-update-bl669\" (UID: \"fb0a89b5-7071-4060-a386-ccf821af25ec\") " pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.324333 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb0a89b5-7071-4060-a386-ccf821af25ec-operator-scripts\") pod \"keystone-ec5e-account-create-update-bl669\" (UID: \"fb0a89b5-7071-4060-a386-ccf821af25ec\") " pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.328854 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ec5e-account-create-update-bl669"] Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.338998 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j5km2"] Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.426485 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrhcj\" (UniqueName: \"kubernetes.io/projected/b1440ad0-f18c-4f97-8001-3a4aaf316279-kube-api-access-qrhcj\") pod \"keystone-db-create-j5km2\" (UID: \"b1440ad0-f18c-4f97-8001-3a4aaf316279\") " pod="openstack/keystone-db-create-j5km2" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.426663 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qpd2\" (UniqueName: \"kubernetes.io/projected/fb0a89b5-7071-4060-a386-ccf821af25ec-kube-api-access-9qpd2\") pod \"keystone-ec5e-account-create-update-bl669\" (UID: \"fb0a89b5-7071-4060-a386-ccf821af25ec\") " pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.426715 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb0a89b5-7071-4060-a386-ccf821af25ec-operator-scripts\") pod \"keystone-ec5e-account-create-update-bl669\" (UID: \"fb0a89b5-7071-4060-a386-ccf821af25ec\") " pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.426797 4841 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1440ad0-f18c-4f97-8001-3a4aaf316279-operator-scripts\") pod \"keystone-db-create-j5km2\" (UID: \"b1440ad0-f18c-4f97-8001-3a4aaf316279\") " pod="openstack/keystone-db-create-j5km2" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.427874 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb0a89b5-7071-4060-a386-ccf821af25ec-operator-scripts\") pod \"keystone-ec5e-account-create-update-bl669\" (UID: \"fb0a89b5-7071-4060-a386-ccf821af25ec\") " pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.428083 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1440ad0-f18c-4f97-8001-3a4aaf316279-operator-scripts\") pod \"keystone-db-create-j5km2\" (UID: \"b1440ad0-f18c-4f97-8001-3a4aaf316279\") " pod="openstack/keystone-db-create-j5km2" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.452661 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-7phr6"] Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.453795 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7phr6" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.454290 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrhcj\" (UniqueName: \"kubernetes.io/projected/b1440ad0-f18c-4f97-8001-3a4aaf316279-kube-api-access-qrhcj\") pod \"keystone-db-create-j5km2\" (UID: \"b1440ad0-f18c-4f97-8001-3a4aaf316279\") " pod="openstack/keystone-db-create-j5km2" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.455089 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpd2\" (UniqueName: \"kubernetes.io/projected/fb0a89b5-7071-4060-a386-ccf821af25ec-kube-api-access-9qpd2\") pod \"keystone-ec5e-account-create-update-bl669\" (UID: \"fb0a89b5-7071-4060-a386-ccf821af25ec\") " pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.476014 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7phr6"] Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.555025 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-961c-account-create-update-t8l55"] Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.556584 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-961c-account-create-update-t8l55" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.558762 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.563326 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-961c-account-create-update-t8l55"] Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.608392 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.624394 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j5km2" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.629269 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0bf21be-1f9f-4a32-915f-9b8503211879-operator-scripts\") pod \"placement-db-create-7phr6\" (UID: \"a0bf21be-1f9f-4a32-915f-9b8503211879\") " pod="openstack/placement-db-create-7phr6" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.629399 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj26l\" (UniqueName: \"kubernetes.io/projected/a0bf21be-1f9f-4a32-915f-9b8503211879-kube-api-access-rj26l\") pod \"placement-db-create-7phr6\" (UID: \"a0bf21be-1f9f-4a32-915f-9b8503211879\") " pod="openstack/placement-db-create-7phr6" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.735803 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0bf21be-1f9f-4a32-915f-9b8503211879-operator-scripts\") pod \"placement-db-create-7phr6\" (UID: \"a0bf21be-1f9f-4a32-915f-9b8503211879\") " pod="openstack/placement-db-create-7phr6" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.735878 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f462ad2e-2c46-4083-8a4f-016cdadc719c-operator-scripts\") pod \"placement-961c-account-create-update-t8l55\" (UID: \"f462ad2e-2c46-4083-8a4f-016cdadc719c\") " pod="openstack/placement-961c-account-create-update-t8l55" Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.735945 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj26l\" (UniqueName: \"kubernetes.io/projected/a0bf21be-1f9f-4a32-915f-9b8503211879-kube-api-access-rj26l\") pod \"placement-db-create-7phr6\" (UID: \"a0bf21be-1f9f-4a32-915f-9b8503211879\") " pod="openstack/placement-db-create-7phr6"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.736092 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd8xk\" (UniqueName: \"kubernetes.io/projected/f462ad2e-2c46-4083-8a4f-016cdadc719c-kube-api-access-pd8xk\") pod \"placement-961c-account-create-update-t8l55\" (UID: \"f462ad2e-2c46-4083-8a4f-016cdadc719c\") " pod="openstack/placement-961c-account-create-update-t8l55"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.737796 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0bf21be-1f9f-4a32-915f-9b8503211879-operator-scripts\") pod \"placement-db-create-7phr6\" (UID: \"a0bf21be-1f9f-4a32-915f-9b8503211879\") " pod="openstack/placement-db-create-7phr6"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.759984 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-nclcf"]
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.761278 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nclcf"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.761806 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj26l\" (UniqueName: \"kubernetes.io/projected/a0bf21be-1f9f-4a32-915f-9b8503211879-kube-api-access-rj26l\") pod \"placement-db-create-7phr6\" (UID: \"a0bf21be-1f9f-4a32-915f-9b8503211879\") " pod="openstack/placement-db-create-7phr6"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.767290 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nclcf"]
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.829724 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7phr6"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.838175 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931a0b37-5c50-415b-ba38-71e3d2c7e632-operator-scripts\") pod \"glance-db-create-nclcf\" (UID: \"931a0b37-5c50-415b-ba38-71e3d2c7e632\") " pod="openstack/glance-db-create-nclcf"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.838250 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd8xk\" (UniqueName: \"kubernetes.io/projected/f462ad2e-2c46-4083-8a4f-016cdadc719c-kube-api-access-pd8xk\") pod \"placement-961c-account-create-update-t8l55\" (UID: \"f462ad2e-2c46-4083-8a4f-016cdadc719c\") " pod="openstack/placement-961c-account-create-update-t8l55"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.838287 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zj9k\" (UniqueName: \"kubernetes.io/projected/931a0b37-5c50-415b-ba38-71e3d2c7e632-kube-api-access-8zj9k\") pod \"glance-db-create-nclcf\" (UID: \"931a0b37-5c50-415b-ba38-71e3d2c7e632\") " pod="openstack/glance-db-create-nclcf"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.838358 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f462ad2e-2c46-4083-8a4f-016cdadc719c-operator-scripts\") pod \"placement-961c-account-create-update-t8l55\" (UID: \"f462ad2e-2c46-4083-8a4f-016cdadc719c\") " pod="openstack/placement-961c-account-create-update-t8l55"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.838982 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f462ad2e-2c46-4083-8a4f-016cdadc719c-operator-scripts\") pod \"placement-961c-account-create-update-t8l55\" (UID: \"f462ad2e-2c46-4083-8a4f-016cdadc719c\") " pod="openstack/placement-961c-account-create-update-t8l55"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.860551 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd8xk\" (UniqueName: \"kubernetes.io/projected/f462ad2e-2c46-4083-8a4f-016cdadc719c-kube-api-access-pd8xk\") pod \"placement-961c-account-create-update-t8l55\" (UID: \"f462ad2e-2c46-4083-8a4f-016cdadc719c\") " pod="openstack/placement-961c-account-create-update-t8l55"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.861330 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-fa9f-account-create-update-fptp5"]
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.862661 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fa9f-account-create-update-fptp5"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.864622 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.878853 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fa9f-account-create-update-fptp5"]
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.911764 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-961c-account-create-update-t8l55"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.946659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931a0b37-5c50-415b-ba38-71e3d2c7e632-operator-scripts\") pod \"glance-db-create-nclcf\" (UID: \"931a0b37-5c50-415b-ba38-71e3d2c7e632\") " pod="openstack/glance-db-create-nclcf"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.946758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zj9k\" (UniqueName: \"kubernetes.io/projected/931a0b37-5c50-415b-ba38-71e3d2c7e632-kube-api-access-8zj9k\") pod \"glance-db-create-nclcf\" (UID: \"931a0b37-5c50-415b-ba38-71e3d2c7e632\") " pod="openstack/glance-db-create-nclcf"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.961350 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931a0b37-5c50-415b-ba38-71e3d2c7e632-operator-scripts\") pod \"glance-db-create-nclcf\" (UID: \"931a0b37-5c50-415b-ba38-71e3d2c7e632\") " pod="openstack/glance-db-create-nclcf"
Dec 03 17:18:12 crc kubenswrapper[4841]: I1203 17:18:12.971234 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zj9k\" (UniqueName: \"kubernetes.io/projected/931a0b37-5c50-415b-ba38-71e3d2c7e632-kube-api-access-8zj9k\") pod \"glance-db-create-nclcf\" (UID: \"931a0b37-5c50-415b-ba38-71e3d2c7e632\") " pod="openstack/glance-db-create-nclcf"
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.048928 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9121a48-14ed-4adc-8373-736c53b56e5b-operator-scripts\") pod \"glance-fa9f-account-create-update-fptp5\" (UID: \"a9121a48-14ed-4adc-8373-736c53b56e5b\") " pod="openstack/glance-fa9f-account-create-update-fptp5"
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.048974 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wlxb\" (UniqueName: \"kubernetes.io/projected/a9121a48-14ed-4adc-8373-736c53b56e5b-kube-api-access-2wlxb\") pod \"glance-fa9f-account-create-update-fptp5\" (UID: \"a9121a48-14ed-4adc-8373-736c53b56e5b\") " pod="openstack/glance-fa9f-account-create-update-fptp5"
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.071799 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ec5e-account-create-update-bl669"]
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.110237 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nclcf"
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.130532 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-j5km2"]
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.151036 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9121a48-14ed-4adc-8373-736c53b56e5b-operator-scripts\") pod \"glance-fa9f-account-create-update-fptp5\" (UID: \"a9121a48-14ed-4adc-8373-736c53b56e5b\") " pod="openstack/glance-fa9f-account-create-update-fptp5"
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.151107 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wlxb\" (UniqueName: \"kubernetes.io/projected/a9121a48-14ed-4adc-8373-736c53b56e5b-kube-api-access-2wlxb\") pod \"glance-fa9f-account-create-update-fptp5\" (UID: \"a9121a48-14ed-4adc-8373-736c53b56e5b\") " pod="openstack/glance-fa9f-account-create-update-fptp5"
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.152005 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9121a48-14ed-4adc-8373-736c53b56e5b-operator-scripts\") pod \"glance-fa9f-account-create-update-fptp5\" (UID: \"a9121a48-14ed-4adc-8373-736c53b56e5b\") " pod="openstack/glance-fa9f-account-create-update-fptp5"
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.170736 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wlxb\" (UniqueName: \"kubernetes.io/projected/a9121a48-14ed-4adc-8373-736c53b56e5b-kube-api-access-2wlxb\") pod \"glance-fa9f-account-create-update-fptp5\" (UID: \"a9121a48-14ed-4adc-8373-736c53b56e5b\") " pod="openstack/glance-fa9f-account-create-update-fptp5"
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.181727 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fa9f-account-create-update-fptp5"
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.249802 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-7phr6"]
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.349396 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-961c-account-create-update-t8l55"]
Dec 03 17:18:13 crc kubenswrapper[4841]: I1203 17:18:13.949012 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec5e-account-create-update-bl669" event={"ID":"fb0a89b5-7071-4060-a386-ccf821af25ec","Type":"ContainerStarted","Data":"5b272c3a2023ba87f4f231724c302b358f9e53dee22b7377d897cebb26c7500a"}
Dec 03 17:18:14 crc kubenswrapper[4841]: W1203 17:18:14.206926 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0bf21be_1f9f_4a32_915f_9b8503211879.slice/crio-012a279dae955ff36355afaa68b13c7a022aaa23e2a1cd1d61adf932df6bb7e3 WatchSource:0}: Error finding container 012a279dae955ff36355afaa68b13c7a022aaa23e2a1cd1d61adf932df6bb7e3: Status 404 returned error can't find the container with id 012a279dae955ff36355afaa68b13c7a022aaa23e2a1cd1d61adf932df6bb7e3
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.682286 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-fa9f-account-create-update-fptp5"]
Dec 03 17:18:14 crc kubenswrapper[4841]: W1203 17:18:14.702112 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9121a48_14ed_4adc_8373_736c53b56e5b.slice/crio-1be29cbb9ee99ac7603a05fd80ce157407dc893abea19a6652431dc71f647046 WatchSource:0}: Error finding container 1be29cbb9ee99ac7603a05fd80ce157407dc893abea19a6652431dc71f647046: Status 404 returned error can't find the container with id 1be29cbb9ee99ac7603a05fd80ce157407dc893abea19a6652431dc71f647046
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.783316 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-nclcf"]
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.858804 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-znrqh"
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.917609 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2fgc6"]
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.917866 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" podUID="8edb0d9b-2ead-4079-9dfd-dea987c55c8a" containerName="dnsmasq-dns" containerID="cri-o://83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32" gracePeriod=10
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.957410 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec5e-account-create-update-bl669" event={"ID":"fb0a89b5-7071-4060-a386-ccf821af25ec","Type":"ContainerStarted","Data":"8e91709042f4cd4d5e532608c308c2d5efdcce4cf621d22b7cb9965f57bb17bb"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.958551 4841 generic.go:334] "Generic (PLEG): container finished" podID="a0bf21be-1f9f-4a32-915f-9b8503211879" containerID="d2c934f166963e2218ba250d9fa9ca3b12cf8b7988efe72534d9a762133ac232" exitCode=0
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.958670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7phr6" event={"ID":"a0bf21be-1f9f-4a32-915f-9b8503211879","Type":"ContainerDied","Data":"d2c934f166963e2218ba250d9fa9ca3b12cf8b7988efe72534d9a762133ac232"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.958738 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7phr6" event={"ID":"a0bf21be-1f9f-4a32-915f-9b8503211879","Type":"ContainerStarted","Data":"012a279dae955ff36355afaa68b13c7a022aaa23e2a1cd1d61adf932df6bb7e3"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.960548 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nclcf" event={"ID":"931a0b37-5c50-415b-ba38-71e3d2c7e632","Type":"ContainerStarted","Data":"e60541cd7aa96d2de0373ce7bb60189b55f1706e21e0e3a9890784fa76f5bedc"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.961739 4841 generic.go:334] "Generic (PLEG): container finished" podID="8be83cad-e31a-463f-9eca-837549c69fba" containerID="f9c8df7e9d20af8562d306f9f9a1e4a38d776b81c058efdba891dfc27e4e076a" exitCode=0
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.961802 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hzcwz" event={"ID":"8be83cad-e31a-463f-9eca-837549c69fba","Type":"ContainerDied","Data":"f9c8df7e9d20af8562d306f9f9a1e4a38d776b81c058efdba891dfc27e4e076a"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.966006 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j5km2" event={"ID":"b1440ad0-f18c-4f97-8001-3a4aaf316279","Type":"ContainerStarted","Data":"7165555f1237e0b598ec78e1f1bed7f727bffe28edcb072a41dbe91a401b3877"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.966035 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j5km2" event={"ID":"b1440ad0-f18c-4f97-8001-3a4aaf316279","Type":"ContainerStarted","Data":"79ceaae1cd4280b70e97dd0286ba6e6dcf729ecca302e9d00918b7cdf3f7a3d5"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.969331 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-961c-account-create-update-t8l55" event={"ID":"f462ad2e-2c46-4083-8a4f-016cdadc719c","Type":"ContainerStarted","Data":"3525fdfda01e10f6ba829f873aecbe0f85518296cf0fa36401eb2ce5c991c5f9"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.969369 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-961c-account-create-update-t8l55" event={"ID":"f462ad2e-2c46-4083-8a4f-016cdadc719c","Type":"ContainerStarted","Data":"ff4ee65c0219d5f0cede3d6446b66694b80af675e6d59f467f7a57aff25328e8"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.970990 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fa9f-account-create-update-fptp5" event={"ID":"a9121a48-14ed-4adc-8373-736c53b56e5b","Type":"ContainerStarted","Data":"1be29cbb9ee99ac7603a05fd80ce157407dc893abea19a6652431dc71f647046"}
Dec 03 17:18:14 crc kubenswrapper[4841]: I1203 17:18:14.984057 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ec5e-account-create-update-bl669" podStartSLOduration=2.9840361509999997 podStartE2EDuration="2.984036151s" podCreationTimestamp="2025-12-03 17:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:14.972233686 +0000 UTC m=+1089.359754413" watchObservedRunningTime="2025-12-03 17:18:14.984036151 +0000 UTC m=+1089.371556878"
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.047305 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-961c-account-create-update-t8l55" podStartSLOduration=3.0472849379999998 podStartE2EDuration="3.047284938s" podCreationTimestamp="2025-12-03 17:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:15.020258297 +0000 UTC m=+1089.407779034" watchObservedRunningTime="2025-12-03 17:18:15.047284938 +0000 UTC m=+1089.434805675"
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.300537 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6"
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.416763 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-config\") pod \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") "
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.416860 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-dns-svc\") pod \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") "
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.416941 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxk7t\" (UniqueName: \"kubernetes.io/projected/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-kube-api-access-dxk7t\") pod \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") "
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.417003 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-nb\") pod \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") "
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.417073 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-sb\") pod \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\" (UID: \"8edb0d9b-2ead-4079-9dfd-dea987c55c8a\") "
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.428359 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-kube-api-access-dxk7t" (OuterVolumeSpecName: "kube-api-access-dxk7t") pod "8edb0d9b-2ead-4079-9dfd-dea987c55c8a" (UID: "8edb0d9b-2ead-4079-9dfd-dea987c55c8a"). InnerVolumeSpecName "kube-api-access-dxk7t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.459177 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "8edb0d9b-2ead-4079-9dfd-dea987c55c8a" (UID: "8edb0d9b-2ead-4079-9dfd-dea987c55c8a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.472791 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "8edb0d9b-2ead-4079-9dfd-dea987c55c8a" (UID: "8edb0d9b-2ead-4079-9dfd-dea987c55c8a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.473361 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "8edb0d9b-2ead-4079-9dfd-dea987c55c8a" (UID: "8edb0d9b-2ead-4079-9dfd-dea987c55c8a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.481472 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-config" (OuterVolumeSpecName: "config") pod "8edb0d9b-2ead-4079-9dfd-dea987c55c8a" (UID: "8edb0d9b-2ead-4079-9dfd-dea987c55c8a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.519336 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-config\") on node \"crc\" DevicePath \"\""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.519390 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.519409 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxk7t\" (UniqueName: \"kubernetes.io/projected/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-kube-api-access-dxk7t\") on node \"crc\" DevicePath \"\""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.519430 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.519448 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8edb0d9b-2ead-4079-9dfd-dea987c55c8a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.979688 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fa9f-account-create-update-fptp5" event={"ID":"a9121a48-14ed-4adc-8373-736c53b56e5b","Type":"ContainerStarted","Data":"b2b40bfcca0e09f6a264d8ec9b8d1f4c487ba795d36cab64d42af07b4e29be10"}
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.982357 4841 generic.go:334] "Generic (PLEG): container finished" podID="fb0a89b5-7071-4060-a386-ccf821af25ec" containerID="8e91709042f4cd4d5e532608c308c2d5efdcce4cf621d22b7cb9965f57bb17bb" exitCode=0
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.982414 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec5e-account-create-update-bl669" event={"ID":"fb0a89b5-7071-4060-a386-ccf821af25ec","Type":"ContainerDied","Data":"8e91709042f4cd4d5e532608c308c2d5efdcce4cf621d22b7cb9965f57bb17bb"}
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.984637 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nclcf" event={"ID":"931a0b37-5c50-415b-ba38-71e3d2c7e632","Type":"ContainerStarted","Data":"88df786e863cbef3c18e98ce4e80c65ca7c6eacd28d6df949f8ac5a47e7cbb5d"}
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.987654 4841 generic.go:334] "Generic (PLEG): container finished" podID="8edb0d9b-2ead-4079-9dfd-dea987c55c8a" containerID="83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32" exitCode=0
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.987712 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" event={"ID":"8edb0d9b-2ead-4079-9dfd-dea987c55c8a","Type":"ContainerDied","Data":"83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32"}
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.987736 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6" event={"ID":"8edb0d9b-2ead-4079-9dfd-dea987c55c8a","Type":"ContainerDied","Data":"dd8a90121fbd56b75a18de5241acfb33d095ac37b7461931b10c3e2cd1f99c65"}
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.987753 4841 scope.go:117] "RemoveContainer" containerID="83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32"
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.987756 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-2fgc6"
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.990105 4841 generic.go:334] "Generic (PLEG): container finished" podID="b1440ad0-f18c-4f97-8001-3a4aaf316279" containerID="7165555f1237e0b598ec78e1f1bed7f727bffe28edcb072a41dbe91a401b3877" exitCode=0
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.990166 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j5km2" event={"ID":"b1440ad0-f18c-4f97-8001-3a4aaf316279","Type":"ContainerDied","Data":"7165555f1237e0b598ec78e1f1bed7f727bffe28edcb072a41dbe91a401b3877"}
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.994716 4841 generic.go:334] "Generic (PLEG): container finished" podID="f462ad2e-2c46-4083-8a4f-016cdadc719c" containerID="3525fdfda01e10f6ba829f873aecbe0f85518296cf0fa36401eb2ce5c991c5f9" exitCode=0
Dec 03 17:18:15 crc kubenswrapper[4841]: I1203 17:18:15.995235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-961c-account-create-update-t8l55" event={"ID":"f462ad2e-2c46-4083-8a4f-016cdadc719c","Type":"ContainerDied","Data":"3525fdfda01e10f6ba829f873aecbe0f85518296cf0fa36401eb2ce5c991c5f9"}
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.016800 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-fa9f-account-create-update-fptp5" podStartSLOduration=4.016778802 podStartE2EDuration="4.016778802s" podCreationTimestamp="2025-12-03 17:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:16.006214705 +0000 UTC m=+1090.393735432" watchObservedRunningTime="2025-12-03 17:18:16.016778802 +0000 UTC m=+1090.404299539"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.028207 4841 scope.go:117] "RemoveContainer" containerID="9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.043890 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-nclcf" podStartSLOduration=4.043868064 podStartE2EDuration="4.043868064s" podCreationTimestamp="2025-12-03 17:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:16.035762105 +0000 UTC m=+1090.423282842" watchObservedRunningTime="2025-12-03 17:18:16.043868064 +0000 UTC m=+1090.431388801"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.060818 4841 scope.go:117] "RemoveContainer" containerID="83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32"
Dec 03 17:18:16 crc kubenswrapper[4841]: E1203 17:18:16.061296 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32\": container with ID starting with 83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32 not found: ID does not exist" containerID="83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.061330 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32"} err="failed to get container status \"83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32\": rpc error: code = NotFound desc = could not find container \"83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32\": container with ID starting with 83dc9724abc558d9933bb24845fd02e66fadead17aa0e64555580d3fc932bb32 not found: ID does not exist"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.061355 4841 scope.go:117] "RemoveContainer" containerID="9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9"
Dec 03 17:18:16 crc kubenswrapper[4841]: E1203 17:18:16.061685 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9\": container with ID starting with 9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9 not found: ID does not exist" containerID="9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.061712 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9"} err="failed to get container status \"9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9\": rpc error: code = NotFound desc = could not find container \"9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9\": container with ID starting with 9a9d805f8326af9679aee643e584cbafeaefedada2f3d2c973f0f0c7b7144dd9 not found: ID does not exist"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.116761 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2fgc6"]
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.124510 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-2fgc6"]
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.263682 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8edb0d9b-2ead-4079-9dfd-dea987c55c8a" path="/var/lib/kubelet/pods/8edb0d9b-2ead-4079-9dfd-dea987c55c8a/volumes"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.375269 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.425884 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-hzcwz"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.440662 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j5km2"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.491326 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-7phr6"
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.536887 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-swiftconf\") pod \"8be83cad-e31a-463f-9eca-837549c69fba\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") "
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.536978 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-combined-ca-bundle\") pod \"8be83cad-e31a-463f-9eca-837549c69fba\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") "
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.537015 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-dispersionconf\") pod \"8be83cad-e31a-463f-9eca-837549c69fba\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") "
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.537064 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrhcj\" (UniqueName: \"kubernetes.io/projected/b1440ad0-f18c-4f97-8001-3a4aaf316279-kube-api-access-qrhcj\") pod \"b1440ad0-f18c-4f97-8001-3a4aaf316279\" (UID: \"b1440ad0-f18c-4f97-8001-3a4aaf316279\") "
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.537120 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8be83cad-e31a-463f-9eca-837549c69fba-etc-swift\") pod \"8be83cad-e31a-463f-9eca-837549c69fba\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") "
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.537162 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-ring-data-devices\") pod \"8be83cad-e31a-463f-9eca-837549c69fba\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") "
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.537181 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqkc5\" (UniqueName: \"kubernetes.io/projected/8be83cad-e31a-463f-9eca-837549c69fba-kube-api-access-gqkc5\") pod \"8be83cad-e31a-463f-9eca-837549c69fba\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") "
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.537282 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-scripts\") pod \"8be83cad-e31a-463f-9eca-837549c69fba\" (UID: \"8be83cad-e31a-463f-9eca-837549c69fba\") "
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.537379 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1440ad0-f18c-4f97-8001-3a4aaf316279-operator-scripts\") pod \"b1440ad0-f18c-4f97-8001-3a4aaf316279\" (UID: \"b1440ad0-f18c-4f97-8001-3a4aaf316279\") "
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.538887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8be83cad-e31a-463f-9eca-837549c69fba" (UID: "8be83cad-e31a-463f-9eca-837549c69fba"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.539482 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8be83cad-e31a-463f-9eca-837549c69fba-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8be83cad-e31a-463f-9eca-837549c69fba" (UID: "8be83cad-e31a-463f-9eca-837549c69fba"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.540508 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1440ad0-f18c-4f97-8001-3a4aaf316279-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1440ad0-f18c-4f97-8001-3a4aaf316279" (UID: "b1440ad0-f18c-4f97-8001-3a4aaf316279"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.542433 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1440ad0-f18c-4f97-8001-3a4aaf316279-kube-api-access-qrhcj" (OuterVolumeSpecName: "kube-api-access-qrhcj") pod "b1440ad0-f18c-4f97-8001-3a4aaf316279" (UID: "b1440ad0-f18c-4f97-8001-3a4aaf316279"). InnerVolumeSpecName "kube-api-access-qrhcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.548241 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8be83cad-e31a-463f-9eca-837549c69fba" (UID: "8be83cad-e31a-463f-9eca-837549c69fba"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.550142 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8be83cad-e31a-463f-9eca-837549c69fba-kube-api-access-gqkc5" (OuterVolumeSpecName: "kube-api-access-gqkc5") pod "8be83cad-e31a-463f-9eca-837549c69fba" (UID: "8be83cad-e31a-463f-9eca-837549c69fba"). InnerVolumeSpecName "kube-api-access-gqkc5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.561224 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8be83cad-e31a-463f-9eca-837549c69fba" (UID: "8be83cad-e31a-463f-9eca-837549c69fba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.564304 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-scripts" (OuterVolumeSpecName: "scripts") pod "8be83cad-e31a-463f-9eca-837549c69fba" (UID: "8be83cad-e31a-463f-9eca-837549c69fba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.577585 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8be83cad-e31a-463f-9eca-837549c69fba" (UID: "8be83cad-e31a-463f-9eca-837549c69fba"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.638963 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj26l\" (UniqueName: \"kubernetes.io/projected/a0bf21be-1f9f-4a32-915f-9b8503211879-kube-api-access-rj26l\") pod \"a0bf21be-1f9f-4a32-915f-9b8503211879\" (UID: \"a0bf21be-1f9f-4a32-915f-9b8503211879\") " Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639081 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0bf21be-1f9f-4a32-915f-9b8503211879-operator-scripts\") pod \"a0bf21be-1f9f-4a32-915f-9b8503211879\" (UID: \"a0bf21be-1f9f-4a32-915f-9b8503211879\") " Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639618 4841 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-swiftconf\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639637 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639648 4841 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8be83cad-e31a-463f-9eca-837549c69fba-dispersionconf\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639659 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrhcj\" (UniqueName: \"kubernetes.io/projected/b1440ad0-f18c-4f97-8001-3a4aaf316279-kube-api-access-qrhcj\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639667 4841 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/8be83cad-e31a-463f-9eca-837549c69fba-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639675 4841 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-ring-data-devices\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639684 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqkc5\" (UniqueName: \"kubernetes.io/projected/8be83cad-e31a-463f-9eca-837549c69fba-kube-api-access-gqkc5\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639754 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8be83cad-e31a-463f-9eca-837549c69fba-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.639765 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1440ad0-f18c-4f97-8001-3a4aaf316279-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.640107 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0bf21be-1f9f-4a32-915f-9b8503211879-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0bf21be-1f9f-4a32-915f-9b8503211879" (UID: "a0bf21be-1f9f-4a32-915f-9b8503211879"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.643489 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0bf21be-1f9f-4a32-915f-9b8503211879-kube-api-access-rj26l" (OuterVolumeSpecName: "kube-api-access-rj26l") pod "a0bf21be-1f9f-4a32-915f-9b8503211879" (UID: "a0bf21be-1f9f-4a32-915f-9b8503211879"). InnerVolumeSpecName "kube-api-access-rj26l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.741313 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj26l\" (UniqueName: \"kubernetes.io/projected/a0bf21be-1f9f-4a32-915f-9b8503211879-kube-api-access-rj26l\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:16 crc kubenswrapper[4841]: I1203 17:18:16.741354 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0bf21be-1f9f-4a32-915f-9b8503211879-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.007748 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-hzcwz" event={"ID":"8be83cad-e31a-463f-9eca-837549c69fba","Type":"ContainerDied","Data":"56913ba5a8680745f695737afc4b7272d0aa6369e47f7961c6dcd038b9ca49e4"} Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.007865 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56913ba5a8680745f695737afc4b7272d0aa6369e47f7961c6dcd038b9ca49e4" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.007767 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-hzcwz" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.010236 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-j5km2" event={"ID":"b1440ad0-f18c-4f97-8001-3a4aaf316279","Type":"ContainerDied","Data":"79ceaae1cd4280b70e97dd0286ba6e6dcf729ecca302e9d00918b7cdf3f7a3d5"} Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.010294 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79ceaae1cd4280b70e97dd0286ba6e6dcf729ecca302e9d00918b7cdf3f7a3d5" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.010359 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-j5km2" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.014268 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9121a48-14ed-4adc-8373-736c53b56e5b" containerID="b2b40bfcca0e09f6a264d8ec9b8d1f4c487ba795d36cab64d42af07b4e29be10" exitCode=0 Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.014375 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fa9f-account-create-update-fptp5" event={"ID":"a9121a48-14ed-4adc-8373-736c53b56e5b","Type":"ContainerDied","Data":"b2b40bfcca0e09f6a264d8ec9b8d1f4c487ba795d36cab64d42af07b4e29be10"} Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.023162 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-7phr6" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.023216 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-7phr6" event={"ID":"a0bf21be-1f9f-4a32-915f-9b8503211879","Type":"ContainerDied","Data":"012a279dae955ff36355afaa68b13c7a022aaa23e2a1cd1d61adf932df6bb7e3"} Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.023250 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="012a279dae955ff36355afaa68b13c7a022aaa23e2a1cd1d61adf932df6bb7e3" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.025098 4841 generic.go:334] "Generic (PLEG): container finished" podID="931a0b37-5c50-415b-ba38-71e3d2c7e632" containerID="88df786e863cbef3c18e98ce4e80c65ca7c6eacd28d6df949f8ac5a47e7cbb5d" exitCode=0 Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.025162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nclcf" event={"ID":"931a0b37-5c50-415b-ba38-71e3d2c7e632","Type":"ContainerDied","Data":"88df786e863cbef3c18e98ce4e80c65ca7c6eacd28d6df949f8ac5a47e7cbb5d"} Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.320094 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.394592 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-961c-account-create-update-t8l55" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.454992 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb0a89b5-7071-4060-a386-ccf821af25ec-operator-scripts\") pod \"fb0a89b5-7071-4060-a386-ccf821af25ec\" (UID: \"fb0a89b5-7071-4060-a386-ccf821af25ec\") " Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.455193 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qpd2\" (UniqueName: \"kubernetes.io/projected/fb0a89b5-7071-4060-a386-ccf821af25ec-kube-api-access-9qpd2\") pod \"fb0a89b5-7071-4060-a386-ccf821af25ec\" (UID: \"fb0a89b5-7071-4060-a386-ccf821af25ec\") " Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.457171 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb0a89b5-7071-4060-a386-ccf821af25ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fb0a89b5-7071-4060-a386-ccf821af25ec" (UID: "fb0a89b5-7071-4060-a386-ccf821af25ec"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.461396 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb0a89b5-7071-4060-a386-ccf821af25ec-kube-api-access-9qpd2" (OuterVolumeSpecName: "kube-api-access-9qpd2") pod "fb0a89b5-7071-4060-a386-ccf821af25ec" (UID: "fb0a89b5-7071-4060-a386-ccf821af25ec"). InnerVolumeSpecName "kube-api-access-9qpd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.556425 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd8xk\" (UniqueName: \"kubernetes.io/projected/f462ad2e-2c46-4083-8a4f-016cdadc719c-kube-api-access-pd8xk\") pod \"f462ad2e-2c46-4083-8a4f-016cdadc719c\" (UID: \"f462ad2e-2c46-4083-8a4f-016cdadc719c\") " Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.556606 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f462ad2e-2c46-4083-8a4f-016cdadc719c-operator-scripts\") pod \"f462ad2e-2c46-4083-8a4f-016cdadc719c\" (UID: \"f462ad2e-2c46-4083-8a4f-016cdadc719c\") " Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.557081 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f462ad2e-2c46-4083-8a4f-016cdadc719c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f462ad2e-2c46-4083-8a4f-016cdadc719c" (UID: "f462ad2e-2c46-4083-8a4f-016cdadc719c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.557151 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fb0a89b5-7071-4060-a386-ccf821af25ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.557166 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qpd2\" (UniqueName: \"kubernetes.io/projected/fb0a89b5-7071-4060-a386-ccf821af25ec-kube-api-access-9qpd2\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.560229 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f462ad2e-2c46-4083-8a4f-016cdadc719c-kube-api-access-pd8xk" (OuterVolumeSpecName: "kube-api-access-pd8xk") pod "f462ad2e-2c46-4083-8a4f-016cdadc719c" (UID: "f462ad2e-2c46-4083-8a4f-016cdadc719c"). InnerVolumeSpecName "kube-api-access-pd8xk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.658491 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd8xk\" (UniqueName: \"kubernetes.io/projected/f462ad2e-2c46-4083-8a4f-016cdadc719c-kube-api-access-pd8xk\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:17 crc kubenswrapper[4841]: I1203 17:18:17.658546 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f462ad2e-2c46-4083-8a4f-016cdadc719c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.055775 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-961c-account-create-update-t8l55" event={"ID":"f462ad2e-2c46-4083-8a4f-016cdadc719c","Type":"ContainerDied","Data":"ff4ee65c0219d5f0cede3d6446b66694b80af675e6d59f467f7a57aff25328e8"} Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.056139 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ff4ee65c0219d5f0cede3d6446b66694b80af675e6d59f467f7a57aff25328e8" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.055817 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-961c-account-create-update-t8l55" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.059334 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ec5e-account-create-update-bl669" event={"ID":"fb0a89b5-7071-4060-a386-ccf821af25ec","Type":"ContainerDied","Data":"5b272c3a2023ba87f4f231724c302b358f9e53dee22b7377d897cebb26c7500a"} Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.059526 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b272c3a2023ba87f4f231724c302b358f9e53dee22b7377d897cebb26c7500a" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.059732 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ec5e-account-create-update-bl669" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.064269 4841 generic.go:334] "Generic (PLEG): container finished" podID="e9e8bee6-ec4a-4743-9ca4-62c37c278958" containerID="b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a" exitCode=0 Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.064369 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9e8bee6-ec4a-4743-9ca4-62c37c278958","Type":"ContainerDied","Data":"b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a"} Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.162211 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-cqt22" podUID="41fcd3cb-c81b-4a15-bb57-aa38bfa47e41" containerName="ovn-controller" probeResult="failure" output=< Dec 03 17:18:18 crc kubenswrapper[4841]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 17:18:18 crc kubenswrapper[4841]: > Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.173842 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.180463 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-cj8qw" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.416663 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-fa9f-account-create-update-fptp5" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425005 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-cqt22-config-8qvbm"] Dec 03 17:18:18 crc kubenswrapper[4841]: E1203 17:18:18.425554 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0bf21be-1f9f-4a32-915f-9b8503211879" containerName="mariadb-database-create" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425572 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0bf21be-1f9f-4a32-915f-9b8503211879" containerName="mariadb-database-create" Dec 03 17:18:18 crc kubenswrapper[4841]: E1203 17:18:18.425594 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f462ad2e-2c46-4083-8a4f-016cdadc719c" containerName="mariadb-account-create-update" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425602 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f462ad2e-2c46-4083-8a4f-016cdadc719c" containerName="mariadb-account-create-update" Dec 03 17:18:18 crc kubenswrapper[4841]: E1203 17:18:18.425626 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1440ad0-f18c-4f97-8001-3a4aaf316279" containerName="mariadb-database-create" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425636 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1440ad0-f18c-4f97-8001-3a4aaf316279" containerName="mariadb-database-create" Dec 03 17:18:18 crc kubenswrapper[4841]: E1203 17:18:18.425647 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fb0a89b5-7071-4060-a386-ccf821af25ec" containerName="mariadb-account-create-update" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425655 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb0a89b5-7071-4060-a386-ccf821af25ec" containerName="mariadb-account-create-update" Dec 03 17:18:18 crc kubenswrapper[4841]: E1203 17:18:18.425670 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9121a48-14ed-4adc-8373-736c53b56e5b" containerName="mariadb-account-create-update" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425679 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9121a48-14ed-4adc-8373-736c53b56e5b" containerName="mariadb-account-create-update" Dec 03 17:18:18 crc kubenswrapper[4841]: E1203 17:18:18.425690 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8be83cad-e31a-463f-9eca-837549c69fba" containerName="swift-ring-rebalance" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425698 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8be83cad-e31a-463f-9eca-837549c69fba" containerName="swift-ring-rebalance" Dec 03 17:18:18 crc kubenswrapper[4841]: E1203 17:18:18.425714 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edb0d9b-2ead-4079-9dfd-dea987c55c8a" containerName="dnsmasq-dns" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425722 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edb0d9b-2ead-4079-9dfd-dea987c55c8a" containerName="dnsmasq-dns" Dec 03 17:18:18 crc kubenswrapper[4841]: E1203 17:18:18.425732 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8edb0d9b-2ead-4079-9dfd-dea987c55c8a" containerName="init" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425741 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8edb0d9b-2ead-4079-9dfd-dea987c55c8a" containerName="init" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425958 4841 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f462ad2e-2c46-4083-8a4f-016cdadc719c" containerName="mariadb-account-create-update" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425975 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1440ad0-f18c-4f97-8001-3a4aaf316279" containerName="mariadb-database-create" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.425991 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0bf21be-1f9f-4a32-915f-9b8503211879" containerName="mariadb-database-create" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.426008 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9121a48-14ed-4adc-8373-736c53b56e5b" containerName="mariadb-account-create-update" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.426026 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb0a89b5-7071-4060-a386-ccf821af25ec" containerName="mariadb-account-create-update" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.426041 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8edb0d9b-2ead-4079-9dfd-dea987c55c8a" containerName="dnsmasq-dns" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.426053 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8be83cad-e31a-463f-9eca-837549c69fba" containerName="swift-ring-rebalance" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.426703 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.429168 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.436507 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cqt22-config-8qvbm"] Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.574883 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wlxb\" (UniqueName: \"kubernetes.io/projected/a9121a48-14ed-4adc-8373-736c53b56e5b-kube-api-access-2wlxb\") pod \"a9121a48-14ed-4adc-8373-736c53b56e5b\" (UID: \"a9121a48-14ed-4adc-8373-736c53b56e5b\") " Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.575030 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9121a48-14ed-4adc-8373-736c53b56e5b-operator-scripts\") pod \"a9121a48-14ed-4adc-8373-736c53b56e5b\" (UID: \"a9121a48-14ed-4adc-8373-736c53b56e5b\") " Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.575256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-scripts\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.575288 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-additional-scripts\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc 
kubenswrapper[4841]: I1203 17:18:18.575321 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.575350 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l42fg\" (UniqueName: \"kubernetes.io/projected/658aa0e7-f197-4504-8cda-30ce35d96d7e-kube-api-access-l42fg\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.575370 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run-ovn\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.575386 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-log-ovn\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.576326 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9121a48-14ed-4adc-8373-736c53b56e5b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a9121a48-14ed-4adc-8373-736c53b56e5b" (UID: "a9121a48-14ed-4adc-8373-736c53b56e5b"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.577981 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-nclcf" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.578333 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9121a48-14ed-4adc-8373-736c53b56e5b-kube-api-access-2wlxb" (OuterVolumeSpecName: "kube-api-access-2wlxb") pod "a9121a48-14ed-4adc-8373-736c53b56e5b" (UID: "a9121a48-14ed-4adc-8373-736c53b56e5b"). InnerVolumeSpecName "kube-api-access-2wlxb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.676394 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zj9k\" (UniqueName: \"kubernetes.io/projected/931a0b37-5c50-415b-ba38-71e3d2c7e632-kube-api-access-8zj9k\") pod \"931a0b37-5c50-415b-ba38-71e3d2c7e632\" (UID: \"931a0b37-5c50-415b-ba38-71e3d2c7e632\") " Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.676792 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931a0b37-5c50-415b-ba38-71e3d2c7e632-operator-scripts\") pod \"931a0b37-5c50-415b-ba38-71e3d2c7e632\" (UID: \"931a0b37-5c50-415b-ba38-71e3d2c7e632\") " Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.677117 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-log-ovn\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.677319 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-scripts\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.677455 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-additional-scripts\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.677573 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.677675 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.677498 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-log-ovn\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.677829 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l42fg\" (UniqueName: 
\"kubernetes.io/projected/658aa0e7-f197-4504-8cda-30ce35d96d7e-kube-api-access-l42fg\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.677981 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run-ovn\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.678079 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run-ovn\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.678139 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-additional-scripts\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.678254 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/931a0b37-5c50-415b-ba38-71e3d2c7e632-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "931a0b37-5c50-415b-ba38-71e3d2c7e632" (UID: "931a0b37-5c50-415b-ba38-71e3d2c7e632"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.678296 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wlxb\" (UniqueName: \"kubernetes.io/projected/a9121a48-14ed-4adc-8373-736c53b56e5b-kube-api-access-2wlxb\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.678316 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9121a48-14ed-4adc-8373-736c53b56e5b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.679670 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-scripts\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.681400 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/931a0b37-5c50-415b-ba38-71e3d2c7e632-kube-api-access-8zj9k" (OuterVolumeSpecName: "kube-api-access-8zj9k") pod "931a0b37-5c50-415b-ba38-71e3d2c7e632" (UID: "931a0b37-5c50-415b-ba38-71e3d2c7e632"). InnerVolumeSpecName "kube-api-access-8zj9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.704059 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l42fg\" (UniqueName: \"kubernetes.io/projected/658aa0e7-f197-4504-8cda-30ce35d96d7e-kube-api-access-l42fg\") pod \"ovn-controller-cqt22-config-8qvbm\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.750000 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.779834 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/931a0b37-5c50-415b-ba38-71e3d2c7e632-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:18 crc kubenswrapper[4841]: I1203 17:18:18.779895 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8zj9k\" (UniqueName: \"kubernetes.io/projected/931a0b37-5c50-415b-ba38-71e3d2c7e632-kube-api-access-8zj9k\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.075956 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-fa9f-account-create-update-fptp5" event={"ID":"a9121a48-14ed-4adc-8373-736c53b56e5b","Type":"ContainerDied","Data":"1be29cbb9ee99ac7603a05fd80ce157407dc893abea19a6652431dc71f647046"} Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.076262 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1be29cbb9ee99ac7603a05fd80ce157407dc893abea19a6652431dc71f647046" Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.075983 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-fa9f-account-create-update-fptp5" Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.078593 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9e8bee6-ec4a-4743-9ca4-62c37c278958","Type":"ContainerStarted","Data":"d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0"} Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.078813 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.080361 4841 generic.go:334] "Generic (PLEG): container finished" podID="388e49e3-0d92-49a4-a165-810b7ac67577" containerID="e648f4c2c0cd4cbfa576aed1bd3ce958793bcb6f8ef0a320a51436843de27c16" exitCode=0 Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.080459 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"388e49e3-0d92-49a4-a165-810b7ac67577","Type":"ContainerDied","Data":"e648f4c2c0cd4cbfa576aed1bd3ce958793bcb6f8ef0a320a51436843de27c16"} Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.081743 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-nclcf" Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.081739 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-nclcf" event={"ID":"931a0b37-5c50-415b-ba38-71e3d2c7e632","Type":"ContainerDied","Data":"e60541cd7aa96d2de0373ce7bb60189b55f1706e21e0e3a9890784fa76f5bedc"} Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.081793 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e60541cd7aa96d2de0373ce7bb60189b55f1706e21e0e3a9890784fa76f5bedc" Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.110954 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=53.869716922 podStartE2EDuration="1m2.110938917s" podCreationTimestamp="2025-12-03 17:17:17 +0000 UTC" firstStartedPulling="2025-12-03 17:17:35.095093953 +0000 UTC m=+1049.482614680" lastFinishedPulling="2025-12-03 17:17:43.336315948 +0000 UTC m=+1057.723836675" observedRunningTime="2025-12-03 17:18:19.100758689 +0000 UTC m=+1093.488279416" watchObservedRunningTime="2025-12-03 17:18:19.110938917 +0000 UTC m=+1093.498459644" Dec 03 17:18:19 crc kubenswrapper[4841]: I1203 17:18:19.236316 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-cqt22-config-8qvbm"] Dec 03 17:18:19 crc kubenswrapper[4841]: W1203 17:18:19.238312 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod658aa0e7_f197_4504_8cda_30ce35d96d7e.slice/crio-dcdf5f19d4665e45b921fd73997748de729b80be867bf52581d2148e5e1f0cc8 WatchSource:0}: Error finding container dcdf5f19d4665e45b921fd73997748de729b80be867bf52581d2148e5e1f0cc8: Status 404 returned error can't find the container with id dcdf5f19d4665e45b921fd73997748de729b80be867bf52581d2148e5e1f0cc8 Dec 03 17:18:20 crc kubenswrapper[4841]: I1203 17:18:20.101709 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"388e49e3-0d92-49a4-a165-810b7ac67577","Type":"ContainerStarted","Data":"b093fdb21f484539e2af54030a2f74d5911234561939db70e5925af20e87f3ef"} Dec 03 17:18:20 crc kubenswrapper[4841]: I1203 17:18:20.102473 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:18:20 crc kubenswrapper[4841]: I1203 17:18:20.107667 4841 generic.go:334] "Generic (PLEG): container finished" podID="658aa0e7-f197-4504-8cda-30ce35d96d7e" containerID="f1f3d439d70a1fafa84a9ab6f3659e96689557ef080da416ec8469461dc0733b" exitCode=0 Dec 03 17:18:20 crc kubenswrapper[4841]: I1203 17:18:20.107982 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cqt22-config-8qvbm" event={"ID":"658aa0e7-f197-4504-8cda-30ce35d96d7e","Type":"ContainerDied","Data":"f1f3d439d70a1fafa84a9ab6f3659e96689557ef080da416ec8469461dc0733b"} Dec 03 17:18:20 crc kubenswrapper[4841]: I1203 17:18:20.108066 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cqt22-config-8qvbm" event={"ID":"658aa0e7-f197-4504-8cda-30ce35d96d7e","Type":"ContainerStarted","Data":"dcdf5f19d4665e45b921fd73997748de729b80be867bf52581d2148e5e1f0cc8"} Dec 03 17:18:20 crc kubenswrapper[4841]: I1203 17:18:20.165049 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=55.200663128 podStartE2EDuration="1m2.165026186s" podCreationTimestamp="2025-12-03 17:17:18 +0000 UTC" firstStartedPulling="2025-12-03 17:17:35.084028544 +0000 UTC m=+1049.471549271" lastFinishedPulling="2025-12-03 17:17:42.048391602 +0000 UTC m=+1056.435912329" observedRunningTime="2025-12-03 17:18:20.137092063 +0000 UTC m=+1094.524612800" watchObservedRunningTime="2025-12-03 17:18:20.165026186 +0000 UTC m=+1094.552546923" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.439593 4841 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.526839 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l42fg\" (UniqueName: \"kubernetes.io/projected/658aa0e7-f197-4504-8cda-30ce35d96d7e-kube-api-access-l42fg\") pod \"658aa0e7-f197-4504-8cda-30ce35d96d7e\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.526956 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-log-ovn\") pod \"658aa0e7-f197-4504-8cda-30ce35d96d7e\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.526994 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-scripts\") pod \"658aa0e7-f197-4504-8cda-30ce35d96d7e\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527090 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-additional-scripts\") pod \"658aa0e7-f197-4504-8cda-30ce35d96d7e\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527102 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "658aa0e7-f197-4504-8cda-30ce35d96d7e" (UID: "658aa0e7-f197-4504-8cda-30ce35d96d7e"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527120 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run\") pod \"658aa0e7-f197-4504-8cda-30ce35d96d7e\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527147 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run" (OuterVolumeSpecName: "var-run") pod "658aa0e7-f197-4504-8cda-30ce35d96d7e" (UID: "658aa0e7-f197-4504-8cda-30ce35d96d7e"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527233 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run-ovn\") pod \"658aa0e7-f197-4504-8cda-30ce35d96d7e\" (UID: \"658aa0e7-f197-4504-8cda-30ce35d96d7e\") " Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527316 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "658aa0e7-f197-4504-8cda-30ce35d96d7e" (UID: "658aa0e7-f197-4504-8cda-30ce35d96d7e"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527808 4841 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527824 4841 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527837 4841 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/658aa0e7-f197-4504-8cda-30ce35d96d7e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.527948 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "658aa0e7-f197-4504-8cda-30ce35d96d7e" (UID: "658aa0e7-f197-4504-8cda-30ce35d96d7e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.528288 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-scripts" (OuterVolumeSpecName: "scripts") pod "658aa0e7-f197-4504-8cda-30ce35d96d7e" (UID: "658aa0e7-f197-4504-8cda-30ce35d96d7e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.532432 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/658aa0e7-f197-4504-8cda-30ce35d96d7e-kube-api-access-l42fg" (OuterVolumeSpecName: "kube-api-access-l42fg") pod "658aa0e7-f197-4504-8cda-30ce35d96d7e" (UID: "658aa0e7-f197-4504-8cda-30ce35d96d7e"). InnerVolumeSpecName "kube-api-access-l42fg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.629606 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.629648 4841 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/658aa0e7-f197-4504-8cda-30ce35d96d7e-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:21 crc kubenswrapper[4841]: I1203 17:18:21.629662 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l42fg\" (UniqueName: \"kubernetes.io/projected/658aa0e7-f197-4504-8cda-30ce35d96d7e-kube-api-access-l42fg\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:22 crc kubenswrapper[4841]: I1203 17:18:22.129141 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-cqt22-config-8qvbm" event={"ID":"658aa0e7-f197-4504-8cda-30ce35d96d7e","Type":"ContainerDied","Data":"dcdf5f19d4665e45b921fd73997748de729b80be867bf52581d2148e5e1f0cc8"} Dec 03 17:18:22 crc kubenswrapper[4841]: I1203 17:18:22.129209 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcdf5f19d4665e45b921fd73997748de729b80be867bf52581d2148e5e1f0cc8" Dec 03 17:18:22 crc kubenswrapper[4841]: I1203 17:18:22.129615 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-cqt22-config-8qvbm" Dec 03 17:18:22 crc kubenswrapper[4841]: I1203 17:18:22.521129 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-cqt22-config-8qvbm"] Dec 03 17:18:22 crc kubenswrapper[4841]: I1203 17:18:22.528166 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-cqt22-config-8qvbm"] Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.141047 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-cqt22" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.184439 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-6mxtf"] Dec 03 17:18:23 crc kubenswrapper[4841]: E1203 17:18:23.184769 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="931a0b37-5c50-415b-ba38-71e3d2c7e632" containerName="mariadb-database-create" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.184784 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="931a0b37-5c50-415b-ba38-71e3d2c7e632" containerName="mariadb-database-create" Dec 03 17:18:23 crc kubenswrapper[4841]: E1203 17:18:23.184797 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="658aa0e7-f197-4504-8cda-30ce35d96d7e" containerName="ovn-config" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.184804 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="658aa0e7-f197-4504-8cda-30ce35d96d7e" containerName="ovn-config" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.184965 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="931a0b37-5c50-415b-ba38-71e3d2c7e632" containerName="mariadb-database-create" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.184975 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="658aa0e7-f197-4504-8cda-30ce35d96d7e" containerName="ovn-config" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 
17:18:23.185448 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.187520 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vnmcl" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.187950 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.197479 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6mxtf"] Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.257670 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-combined-ca-bundle\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.257730 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m75k\" (UniqueName: \"kubernetes.io/projected/db235153-a06b-4f9b-9129-76d9e7d7b1e4-kube-api-access-4m75k\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.257757 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-config-data\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.257860 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" 
(UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-db-sync-config-data\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.359722 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-db-sync-config-data\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.359853 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-combined-ca-bundle\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.359885 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m75k\" (UniqueName: \"kubernetes.io/projected/db235153-a06b-4f9b-9129-76d9e7d7b1e4-kube-api-access-4m75k\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.359934 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-config-data\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.366918 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-db-sync-config-data\") pod 
\"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.375023 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-config-data\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.376701 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-combined-ca-bundle\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.387719 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m75k\" (UniqueName: \"kubernetes.io/projected/db235153-a06b-4f9b-9129-76d9e7d7b1e4-kube-api-access-4m75k\") pod \"glance-db-sync-6mxtf\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:23 crc kubenswrapper[4841]: I1203 17:18:23.509528 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:24 crc kubenswrapper[4841]: I1203 17:18:24.089125 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-6mxtf"] Dec 03 17:18:24 crc kubenswrapper[4841]: W1203 17:18:24.090253 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddb235153_a06b_4f9b_9129_76d9e7d7b1e4.slice/crio-de9e1bfa13cd958ae5c8c158b656e99cd5529871f4e9cc79d85dc158f030c741 WatchSource:0}: Error finding container de9e1bfa13cd958ae5c8c158b656e99cd5529871f4e9cc79d85dc158f030c741: Status 404 returned error can't find the container with id de9e1bfa13cd958ae5c8c158b656e99cd5529871f4e9cc79d85dc158f030c741 Dec 03 17:18:24 crc kubenswrapper[4841]: I1203 17:18:24.148448 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6mxtf" event={"ID":"db235153-a06b-4f9b-9129-76d9e7d7b1e4","Type":"ContainerStarted","Data":"de9e1bfa13cd958ae5c8c158b656e99cd5529871f4e9cc79d85dc158f030c741"} Dec 03 17:18:24 crc kubenswrapper[4841]: I1203 17:18:24.251003 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="658aa0e7-f197-4504-8cda-30ce35d96d7e" path="/var/lib/kubelet/pods/658aa0e7-f197-4504-8cda-30ce35d96d7e/volumes" Dec 03 17:18:27 crc kubenswrapper[4841]: I1203 17:18:27.448323 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " pod="openstack/swift-storage-0" Dec 03 17:18:27 crc kubenswrapper[4841]: I1203 17:18:27.456633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/73a827c8-6b3c-4ffa-9c76-3d3591f38182-etc-swift\") pod \"swift-storage-0\" (UID: \"73a827c8-6b3c-4ffa-9c76-3d3591f38182\") " 
pod="openstack/swift-storage-0" Dec 03 17:18:27 crc kubenswrapper[4841]: I1203 17:18:27.652116 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 17:18:28 crc kubenswrapper[4841]: I1203 17:18:28.265417 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 17:18:28 crc kubenswrapper[4841]: W1203 17:18:28.268255 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73a827c8_6b3c_4ffa_9c76_3d3591f38182.slice/crio-ffac603f058c3f6017c69079f8d49248fb933c894f9644e27b486c87be11a809 WatchSource:0}: Error finding container ffac603f058c3f6017c69079f8d49248fb933c894f9644e27b486c87be11a809: Status 404 returned error can't find the container with id ffac603f058c3f6017c69079f8d49248fb933c894f9644e27b486c87be11a809 Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.192769 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"ffac603f058c3f6017c69079f8d49248fb933c894f9644e27b486c87be11a809"} Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.208680 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.605077 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.742338 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-2fp7p"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.750036 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.793650 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2fp7p"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.802565 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-576f-account-create-update-q8hwq"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.803569 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.808930 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.810986 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-576f-account-create-update-q8hwq"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.819282 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-78vbd"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.820358 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.832838 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-78vbd"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.884879 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-c5l22"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.891472 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-c5l22" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.898827 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-c5l22"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.914176 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2aa5656-d3cf-43de-af5a-a9ba522ede37-operator-scripts\") pod \"barbican-db-create-78vbd\" (UID: \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\") " pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.914241 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfw7m\" (UniqueName: \"kubernetes.io/projected/18a0cfe2-d206-4a46-b6ff-08a332049b44-kube-api-access-sfw7m\") pod \"barbican-576f-account-create-update-q8hwq\" (UID: \"18a0cfe2-d206-4a46-b6ff-08a332049b44\") " pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.914268 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxhlh\" (UniqueName: \"kubernetes.io/projected/1fc94d86-48ee-4deb-9bf2-7606c6de4515-kube-api-access-qxhlh\") pod \"cinder-db-create-2fp7p\" (UID: \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\") " pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.914312 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksqz9\" (UniqueName: \"kubernetes.io/projected/b2aa5656-d3cf-43de-af5a-a9ba522ede37-kube-api-access-ksqz9\") pod \"barbican-db-create-78vbd\" (UID: \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\") " pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.914345 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc94d86-48ee-4deb-9bf2-7606c6de4515-operator-scripts\") pod \"cinder-db-create-2fp7p\" (UID: \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\") " pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.914372 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18a0cfe2-d206-4a46-b6ff-08a332049b44-operator-scripts\") pod \"barbican-576f-account-create-update-q8hwq\" (UID: \"18a0cfe2-d206-4a46-b6ff-08a332049b44\") " pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.947679 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-cpmwr"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.949100 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.955148 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.955290 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.955369 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qfbt" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.955535 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.976138 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cpmwr"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.984403 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-5830-account-create-update-2sv68"] Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.987504 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:29 crc kubenswrapper[4841]: I1203 17:18:29.994818 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.009344 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5830-account-create-update-2sv68"] Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.016252 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfw7m\" (UniqueName: \"kubernetes.io/projected/18a0cfe2-d206-4a46-b6ff-08a332049b44-kube-api-access-sfw7m\") pod \"barbican-576f-account-create-update-q8hwq\" (UID: \"18a0cfe2-d206-4a46-b6ff-08a332049b44\") " pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.016305 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxhlh\" (UniqueName: \"kubernetes.io/projected/1fc94d86-48ee-4deb-9bf2-7606c6de4515-kube-api-access-qxhlh\") pod \"cinder-db-create-2fp7p\" (UID: \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\") " pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.016341 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ksqz9\" (UniqueName: \"kubernetes.io/projected/b2aa5656-d3cf-43de-af5a-a9ba522ede37-kube-api-access-ksqz9\") pod \"barbican-db-create-78vbd\" (UID: \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\") " pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.016389 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc94d86-48ee-4deb-9bf2-7606c6de4515-operator-scripts\") pod \"cinder-db-create-2fp7p\" (UID: \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\") " 
pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.016429 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d7cj\" (UniqueName: \"kubernetes.io/projected/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-kube-api-access-8d7cj\") pod \"heat-db-create-c5l22\" (UID: \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\") " pod="openstack/heat-db-create-c5l22" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.016462 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18a0cfe2-d206-4a46-b6ff-08a332049b44-operator-scripts\") pod \"barbican-576f-account-create-update-q8hwq\" (UID: \"18a0cfe2-d206-4a46-b6ff-08a332049b44\") " pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.016517 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-operator-scripts\") pod \"heat-db-create-c5l22\" (UID: \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\") " pod="openstack/heat-db-create-c5l22" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.016575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2aa5656-d3cf-43de-af5a-a9ba522ede37-operator-scripts\") pod \"barbican-db-create-78vbd\" (UID: \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\") " pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.017410 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2aa5656-d3cf-43de-af5a-a9ba522ede37-operator-scripts\") pod \"barbican-db-create-78vbd\" (UID: \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\") " 
pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.017658 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc94d86-48ee-4deb-9bf2-7606c6de4515-operator-scripts\") pod \"cinder-db-create-2fp7p\" (UID: \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\") " pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.017764 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18a0cfe2-d206-4a46-b6ff-08a332049b44-operator-scripts\") pod \"barbican-576f-account-create-update-q8hwq\" (UID: \"18a0cfe2-d206-4a46-b6ff-08a332049b44\") " pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.039994 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxhlh\" (UniqueName: \"kubernetes.io/projected/1fc94d86-48ee-4deb-9bf2-7606c6de4515-kube-api-access-qxhlh\") pod \"cinder-db-create-2fp7p\" (UID: \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\") " pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.050690 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfw7m\" (UniqueName: \"kubernetes.io/projected/18a0cfe2-d206-4a46-b6ff-08a332049b44-kube-api-access-sfw7m\") pod \"barbican-576f-account-create-update-q8hwq\" (UID: \"18a0cfe2-d206-4a46-b6ff-08a332049b44\") " pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.083143 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksqz9\" (UniqueName: \"kubernetes.io/projected/b2aa5656-d3cf-43de-af5a-a9ba522ede37-kube-api-access-ksqz9\") pod \"barbican-db-create-78vbd\" (UID: \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\") " 
pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.084075 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-51cb-account-create-update-8zh96"] Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.085263 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.091719 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-51cb-account-create-update-8zh96"] Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.094116 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.094242 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.118472 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kskn5\" (UniqueName: \"kubernetes.io/projected/c54c21eb-8621-4959-a9be-de2efd6d1bb0-kube-api-access-kskn5\") pod \"heat-5830-account-create-update-2sv68\" (UID: \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\") " pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.118553 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzq9g\" (UniqueName: \"kubernetes.io/projected/336897e4-2c50-4739-b719-db8fa6b2389d-kube-api-access-mzq9g\") pod \"keystone-db-sync-cpmwr\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.118642 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-combined-ca-bundle\") pod \"keystone-db-sync-cpmwr\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.118674 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-config-data\") pod \"keystone-db-sync-cpmwr\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.118729 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d7cj\" (UniqueName: \"kubernetes.io/projected/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-kube-api-access-8d7cj\") pod \"heat-db-create-c5l22\" (UID: \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\") " pod="openstack/heat-db-create-c5l22" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.118944 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-operator-scripts\") pod \"heat-db-create-c5l22\" (UID: \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\") " pod="openstack/heat-db-create-c5l22" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.119616 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-operator-scripts\") pod \"heat-db-create-c5l22\" (UID: \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\") " pod="openstack/heat-db-create-c5l22" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.119704 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/c54c21eb-8621-4959-a9be-de2efd6d1bb0-operator-scripts\") pod \"heat-5830-account-create-update-2sv68\" (UID: \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\") " pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.124570 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.135698 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d7cj\" (UniqueName: \"kubernetes.io/projected/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-kube-api-access-8d7cj\") pod \"heat-db-create-c5l22\" (UID: \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\") " pod="openstack/heat-db-create-c5l22" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.136285 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.171316 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-sl79f"] Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.172459 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.189134 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sl79f"] Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.209941 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-c5l22" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.221384 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-config-data\") pod \"keystone-db-sync-cpmwr\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.221587 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vkgl\" (UniqueName: \"kubernetes.io/projected/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-kube-api-access-2vkgl\") pod \"cinder-51cb-account-create-update-8zh96\" (UID: \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\") " pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.221659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c54c21eb-8621-4959-a9be-de2efd6d1bb0-operator-scripts\") pod \"heat-5830-account-create-update-2sv68\" (UID: \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\") " pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.221694 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-operator-scripts\") pod \"cinder-51cb-account-create-update-8zh96\" (UID: \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\") " pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.221752 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kskn5\" (UniqueName: \"kubernetes.io/projected/c54c21eb-8621-4959-a9be-de2efd6d1bb0-kube-api-access-kskn5\") pod 
\"heat-5830-account-create-update-2sv68\" (UID: \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\") " pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.221789 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzq9g\" (UniqueName: \"kubernetes.io/projected/336897e4-2c50-4739-b719-db8fa6b2389d-kube-api-access-mzq9g\") pod \"keystone-db-sync-cpmwr\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.221864 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-combined-ca-bundle\") pod \"keystone-db-sync-cpmwr\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.222614 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c54c21eb-8621-4959-a9be-de2efd6d1bb0-operator-scripts\") pod \"heat-5830-account-create-update-2sv68\" (UID: \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\") " pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.227427 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-config-data\") pod \"keystone-db-sync-cpmwr\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.235421 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-combined-ca-bundle\") pod \"keystone-db-sync-cpmwr\" (UID: 
\"336897e4-2c50-4739-b719-db8fa6b2389d\") " pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.240787 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kskn5\" (UniqueName: \"kubernetes.io/projected/c54c21eb-8621-4959-a9be-de2efd6d1bb0-kube-api-access-kskn5\") pod \"heat-5830-account-create-update-2sv68\" (UID: \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\") " pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.242404 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzq9g\" (UniqueName: \"kubernetes.io/projected/336897e4-2c50-4739-b719-db8fa6b2389d-kube-api-access-mzq9g\") pod \"keystone-db-sync-cpmwr\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.271997 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.314242 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.323666 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vkgl\" (UniqueName: \"kubernetes.io/projected/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-kube-api-access-2vkgl\") pod \"cinder-51cb-account-create-update-8zh96\" (UID: \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\") " pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.323873 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cd21556-0acb-47d3-8bd2-da1a675ac155-operator-scripts\") pod \"neutron-db-create-sl79f\" (UID: \"0cd21556-0acb-47d3-8bd2-da1a675ac155\") " pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.324065 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-operator-scripts\") pod \"cinder-51cb-account-create-update-8zh96\" (UID: \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\") " pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.324280 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddsq4\" (UniqueName: \"kubernetes.io/projected/0cd21556-0acb-47d3-8bd2-da1a675ac155-kube-api-access-ddsq4\") pod \"neutron-db-create-sl79f\" (UID: \"0cd21556-0acb-47d3-8bd2-da1a675ac155\") " pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.325406 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-operator-scripts\") pod 
\"cinder-51cb-account-create-update-8zh96\" (UID: \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\") " pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.339573 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vkgl\" (UniqueName: \"kubernetes.io/projected/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-kube-api-access-2vkgl\") pod \"cinder-51cb-account-create-update-8zh96\" (UID: \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\") " pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.370476 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d8ff-account-create-update-zsbgm"] Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.371524 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.375861 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.399226 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d8ff-account-create-update-zsbgm"] Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.426871 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cd21556-0acb-47d3-8bd2-da1a675ac155-operator-scripts\") pod \"neutron-db-create-sl79f\" (UID: \"0cd21556-0acb-47d3-8bd2-da1a675ac155\") " pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.427177 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddsq4\" (UniqueName: \"kubernetes.io/projected/0cd21556-0acb-47d3-8bd2-da1a675ac155-kube-api-access-ddsq4\") pod \"neutron-db-create-sl79f\" (UID: 
\"0cd21556-0acb-47d3-8bd2-da1a675ac155\") " pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.427770 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cd21556-0acb-47d3-8bd2-da1a675ac155-operator-scripts\") pod \"neutron-db-create-sl79f\" (UID: \"0cd21556-0acb-47d3-8bd2-da1a675ac155\") " pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.446620 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddsq4\" (UniqueName: \"kubernetes.io/projected/0cd21556-0acb-47d3-8bd2-da1a675ac155-kube-api-access-ddsq4\") pod \"neutron-db-create-sl79f\" (UID: \"0cd21556-0acb-47d3-8bd2-da1a675ac155\") " pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.475704 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.487296 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.528879 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk9kd\" (UniqueName: \"kubernetes.io/projected/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-kube-api-access-tk9kd\") pod \"neutron-d8ff-account-create-update-zsbgm\" (UID: \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\") " pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.528959 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-operator-scripts\") pod \"neutron-d8ff-account-create-update-zsbgm\" (UID: \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\") " pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.631794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk9kd\" (UniqueName: \"kubernetes.io/projected/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-kube-api-access-tk9kd\") pod \"neutron-d8ff-account-create-update-zsbgm\" (UID: \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\") " pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.631842 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-operator-scripts\") pod \"neutron-d8ff-account-create-update-zsbgm\" (UID: \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\") " pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.632585 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-operator-scripts\") pod \"neutron-d8ff-account-create-update-zsbgm\" (UID: \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\") " pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.647728 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk9kd\" (UniqueName: \"kubernetes.io/projected/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-kube-api-access-tk9kd\") pod \"neutron-d8ff-account-create-update-zsbgm\" (UID: \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\") " pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:30 crc kubenswrapper[4841]: I1203 17:18:30.699075 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:38 crc kubenswrapper[4841]: I1203 17:18:38.006638 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-2fp7p"] Dec 03 17:18:38 crc kubenswrapper[4841]: I1203 17:18:38.150782 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-78vbd"] Dec 03 17:18:38 crc kubenswrapper[4841]: I1203 17:18:38.163317 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sl79f"] Dec 03 17:18:38 crc kubenswrapper[4841]: I1203 17:18:38.229695 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-c5l22"] Dec 03 17:18:38 crc kubenswrapper[4841]: I1203 17:18:38.238714 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-576f-account-create-update-q8hwq"] Dec 03 17:18:38 crc kubenswrapper[4841]: I1203 17:18:38.249365 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cpmwr"] Dec 03 17:18:38 crc kubenswrapper[4841]: W1203 17:18:38.263145 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1fc94d86_48ee_4deb_9bf2_7606c6de4515.slice/crio-aa02bb56919237421bfd79f35818e74c2620ae031a73ccfe8478919757ad84c5 WatchSource:0}: Error finding container aa02bb56919237421bfd79f35818e74c2620ae031a73ccfe8478919757ad84c5: Status 404 returned error can't find the container with id aa02bb56919237421bfd79f35818e74c2620ae031a73ccfe8478919757ad84c5 Dec 03 17:18:38 crc kubenswrapper[4841]: W1203 17:18:38.268920 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cd21556_0acb_47d3_8bd2_da1a675ac155.slice/crio-1b0135190a70706478f9283e30e292c03bf361cf3d23f5780f65f6e4c657972d WatchSource:0}: Error finding container 1b0135190a70706478f9283e30e292c03bf361cf3d23f5780f65f6e4c657972d: Status 404 returned error can't find the container with id 1b0135190a70706478f9283e30e292c03bf361cf3d23f5780f65f6e4c657972d Dec 03 17:18:38 crc kubenswrapper[4841]: W1203 17:18:38.277197 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod336897e4_2c50_4739_b719_db8fa6b2389d.slice/crio-a7853d20cd5541575f74f3f82d8ccb61dd8f6cbaf9bd2f966d6e6a18c4a547f4 WatchSource:0}: Error finding container a7853d20cd5541575f74f3f82d8ccb61dd8f6cbaf9bd2f966d6e6a18c4a547f4: Status 404 returned error can't find the container with id a7853d20cd5541575f74f3f82d8ccb61dd8f6cbaf9bd2f966d6e6a18c4a547f4 Dec 03 17:18:38 crc kubenswrapper[4841]: I1203 17:18:38.346377 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-5830-account-create-update-2sv68"] Dec 03 17:18:38 crc kubenswrapper[4841]: I1203 17:18:38.355198 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-51cb-account-create-update-8zh96"] Dec 03 17:18:38 crc kubenswrapper[4841]: I1203 17:18:38.361772 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-d8ff-account-create-update-zsbgm"] Dec 03 17:18:38 crc kubenswrapper[4841]: W1203 17:18:38.365416 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc54c21eb_8621_4959_a9be_de2efd6d1bb0.slice/crio-6c662ca72bf6470354265916dc4cb2fc777892dbb74c57bea4675af1ab2f22b2 WatchSource:0}: Error finding container 6c662ca72bf6470354265916dc4cb2fc777892dbb74c57bea4675af1ab2f22b2: Status 404 returned error can't find the container with id 6c662ca72bf6470354265916dc4cb2fc777892dbb74c57bea4675af1ab2f22b2 Dec 03 17:18:38 crc kubenswrapper[4841]: W1203 17:18:38.369485 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b308ac2_d80a_4b8e_9c3b_065c794f6a00.slice/crio-d4159a484e4db246cdb8c33a0f277314feb9d13de7997276af995d4f6fc8296a WatchSource:0}: Error finding container d4159a484e4db246cdb8c33a0f277314feb9d13de7997276af995d4f6fc8296a: Status 404 returned error can't find the container with id d4159a484e4db246cdb8c33a0f277314feb9d13de7997276af995d4f6fc8296a Dec 03 17:18:38 crc kubenswrapper[4841]: W1203 17:18:38.375306 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f5d52e5_cdf1_4148_8481_8e8fa5bda200.slice/crio-9c68f76c37ec558cee9e27cb664cbcd31991254aaeef52a50fd958fe072c9493 WatchSource:0}: Error finding container 9c68f76c37ec558cee9e27cb664cbcd31991254aaeef52a50fd958fe072c9493: Status 404 returned error can't find the container with id 9c68f76c37ec558cee9e27cb664cbcd31991254aaeef52a50fd958fe072c9493 Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.276826 4841 generic.go:334] "Generic (PLEG): container finished" podID="7b308ac2-d80a-4b8e-9c3b-065c794f6a00" containerID="1bf8594f901a1a02da5e7767f236f191c820e2f1710a10a617f523c31dc52d84" exitCode=0 Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 
17:18:39.276924 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-51cb-account-create-update-8zh96" event={"ID":"7b308ac2-d80a-4b8e-9c3b-065c794f6a00","Type":"ContainerDied","Data":"1bf8594f901a1a02da5e7767f236f191c820e2f1710a10a617f523c31dc52d84"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.277416 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-51cb-account-create-update-8zh96" event={"ID":"7b308ac2-d80a-4b8e-9c3b-065c794f6a00","Type":"ContainerStarted","Data":"d4159a484e4db246cdb8c33a0f277314feb9d13de7997276af995d4f6fc8296a"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.279361 4841 generic.go:334] "Generic (PLEG): container finished" podID="1fc94d86-48ee-4deb-9bf2-7606c6de4515" containerID="cf21ee40cdc8d44dfe634a4873c13fa1b22953016a770dd55ac1a379e6c3804d" exitCode=0 Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.279472 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2fp7p" event={"ID":"1fc94d86-48ee-4deb-9bf2-7606c6de4515","Type":"ContainerDied","Data":"cf21ee40cdc8d44dfe634a4873c13fa1b22953016a770dd55ac1a379e6c3804d"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.279493 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2fp7p" event={"ID":"1fc94d86-48ee-4deb-9bf2-7606c6de4515","Type":"ContainerStarted","Data":"aa02bb56919237421bfd79f35818e74c2620ae031a73ccfe8478919757ad84c5"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.281491 4841 generic.go:334] "Generic (PLEG): container finished" podID="5f5d52e5-cdf1-4148-8481-8e8fa5bda200" containerID="b3059769bb5b094d93a59edee97b8c8e70d3b31d09cab6f378f9d3d21beee1c3" exitCode=0 Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.281547 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d8ff-account-create-update-zsbgm" 
event={"ID":"5f5d52e5-cdf1-4148-8481-8e8fa5bda200","Type":"ContainerDied","Data":"b3059769bb5b094d93a59edee97b8c8e70d3b31d09cab6f378f9d3d21beee1c3"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.281569 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d8ff-account-create-update-zsbgm" event={"ID":"5f5d52e5-cdf1-4148-8481-8e8fa5bda200","Type":"ContainerStarted","Data":"9c68f76c37ec558cee9e27cb664cbcd31991254aaeef52a50fd958fe072c9493"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.283107 4841 generic.go:334] "Generic (PLEG): container finished" podID="18a0cfe2-d206-4a46-b6ff-08a332049b44" containerID="79832e253ef65b2abcf988bd7233b615a262a974cc3bd7deb32227f0165a0dbc" exitCode=0 Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.283148 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-576f-account-create-update-q8hwq" event={"ID":"18a0cfe2-d206-4a46-b6ff-08a332049b44","Type":"ContainerDied","Data":"79832e253ef65b2abcf988bd7233b615a262a974cc3bd7deb32227f0165a0dbc"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.283204 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-576f-account-create-update-q8hwq" event={"ID":"18a0cfe2-d206-4a46-b6ff-08a332049b44","Type":"ContainerStarted","Data":"5db94b7301490e5c623fc112d0914d73794779b770bd0f8cd4970daaadd71df9"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.285472 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"a3906e4629bdd54d7109c24684872f6d70ef9a8d1ae4962c2774adac2d8965fb"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.285506 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"e8b3009ce4d7bc3c15e8cde5e466ac8e3147974bc83c479e069d46e3bf69ee2f"} Dec 03 
17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.287332 4841 generic.go:334] "Generic (PLEG): container finished" podID="0cd21556-0acb-47d3-8bd2-da1a675ac155" containerID="2cf5426404531e12a46ee3a8e44d387c74a074af0054d3b84a14a5f6be7c6c34" exitCode=0 Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.287401 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sl79f" event={"ID":"0cd21556-0acb-47d3-8bd2-da1a675ac155","Type":"ContainerDied","Data":"2cf5426404531e12a46ee3a8e44d387c74a074af0054d3b84a14a5f6be7c6c34"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.287494 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sl79f" event={"ID":"0cd21556-0acb-47d3-8bd2-da1a675ac155","Type":"ContainerStarted","Data":"1b0135190a70706478f9283e30e292c03bf361cf3d23f5780f65f6e4c657972d"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.289139 4841 generic.go:334] "Generic (PLEG): container finished" podID="b2aa5656-d3cf-43de-af5a-a9ba522ede37" containerID="a70c177d1c91b8ef1be80f0994ac50afea348a143d80a2dc27df8c4536413465" exitCode=0 Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.289202 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-78vbd" event={"ID":"b2aa5656-d3cf-43de-af5a-a9ba522ede37","Type":"ContainerDied","Data":"a70c177d1c91b8ef1be80f0994ac50afea348a143d80a2dc27df8c4536413465"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.289224 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-78vbd" event={"ID":"b2aa5656-d3cf-43de-af5a-a9ba522ede37","Type":"ContainerStarted","Data":"c7e98689c2705274c70f84576c7bc1a9c15a0efee375c7d94ba711f3e5fc8962"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.290378 4841 generic.go:334] "Generic (PLEG): container finished" podID="c54c21eb-8621-4959-a9be-de2efd6d1bb0" containerID="a6924621b7a8a58ce8137d6ea9a099a6dfb362785e7fc3124c567f3822756d36" exitCode=0 
Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.290427 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5830-account-create-update-2sv68" event={"ID":"c54c21eb-8621-4959-a9be-de2efd6d1bb0","Type":"ContainerDied","Data":"a6924621b7a8a58ce8137d6ea9a099a6dfb362785e7fc3124c567f3822756d36"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.290454 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5830-account-create-update-2sv68" event={"ID":"c54c21eb-8621-4959-a9be-de2efd6d1bb0","Type":"ContainerStarted","Data":"6c662ca72bf6470354265916dc4cb2fc777892dbb74c57bea4675af1ab2f22b2"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.298675 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cpmwr" event={"ID":"336897e4-2c50-4739-b719-db8fa6b2389d","Type":"ContainerStarted","Data":"a7853d20cd5541575f74f3f82d8ccb61dd8f6cbaf9bd2f966d6e6a18c4a547f4"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.300174 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6mxtf" event={"ID":"db235153-a06b-4f9b-9129-76d9e7d7b1e4","Type":"ContainerStarted","Data":"05436975c6cd7a79998e170cfe416d49688d2f7bf8c0e75e3a5abab4ff9239b8"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.302120 4841 generic.go:334] "Generic (PLEG): container finished" podID="9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35" containerID="deb21a925338c7a6c78babba12d8feb024209e4c35d880ffd2ddfcacbfff810c" exitCode=0 Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.302169 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-c5l22" event={"ID":"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35","Type":"ContainerDied","Data":"deb21a925338c7a6c78babba12d8feb024209e4c35d880ffd2ddfcacbfff810c"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.302187 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-c5l22" 
event={"ID":"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35","Type":"ContainerStarted","Data":"3666e5df406a8298fb8bafbf5376cc54e409f2fa52988aa5a1210744875e9d9d"} Dec 03 17:18:39 crc kubenswrapper[4841]: I1203 17:18:39.393420 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-6mxtf" podStartSLOduration=2.945399574 podStartE2EDuration="16.393400889s" podCreationTimestamp="2025-12-03 17:18:23 +0000 UTC" firstStartedPulling="2025-12-03 17:18:24.091881381 +0000 UTC m=+1098.479402118" lastFinishedPulling="2025-12-03 17:18:37.539882676 +0000 UTC m=+1111.927403433" observedRunningTime="2025-12-03 17:18:39.388678529 +0000 UTC m=+1113.776199256" watchObservedRunningTime="2025-12-03 17:18:39.393400889 +0000 UTC m=+1113.780921616" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.314443 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"2e80c76742baf89f1979b9160635ec6c87172f36213d50be5aa1560db79342d4"} Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.710638 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.830476 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfw7m\" (UniqueName: \"kubernetes.io/projected/18a0cfe2-d206-4a46-b6ff-08a332049b44-kube-api-access-sfw7m\") pod \"18a0cfe2-d206-4a46-b6ff-08a332049b44\" (UID: \"18a0cfe2-d206-4a46-b6ff-08a332049b44\") " Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.830662 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18a0cfe2-d206-4a46-b6ff-08a332049b44-operator-scripts\") pod \"18a0cfe2-d206-4a46-b6ff-08a332049b44\" (UID: \"18a0cfe2-d206-4a46-b6ff-08a332049b44\") " Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.831588 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a0cfe2-d206-4a46-b6ff-08a332049b44-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "18a0cfe2-d206-4a46-b6ff-08a332049b44" (UID: "18a0cfe2-d206-4a46-b6ff-08a332049b44"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.838466 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a0cfe2-d206-4a46-b6ff-08a332049b44-kube-api-access-sfw7m" (OuterVolumeSpecName: "kube-api-access-sfw7m") pod "18a0cfe2-d206-4a46-b6ff-08a332049b44" (UID: "18a0cfe2-d206-4a46-b6ff-08a332049b44"). InnerVolumeSpecName "kube-api-access-sfw7m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.912727 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.919056 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-c5l22" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.932663 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/18a0cfe2-d206-4a46-b6ff-08a332049b44-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.932702 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfw7m\" (UniqueName: \"kubernetes.io/projected/18a0cfe2-d206-4a46-b6ff-08a332049b44-kube-api-access-sfw7m\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.943451 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.944154 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.960661 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.971082 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:40 crc kubenswrapper[4841]: I1203 17:18:40.978954 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.033716 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-operator-scripts\") pod \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\" (UID: \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.033835 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d7cj\" (UniqueName: \"kubernetes.io/projected/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-kube-api-access-8d7cj\") pod \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\" (UID: \"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.033870 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-operator-scripts\") pod \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\" (UID: \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.033948 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk9kd\" (UniqueName: \"kubernetes.io/projected/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-kube-api-access-tk9kd\") pod \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\" (UID: \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.033977 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksqz9\" (UniqueName: \"kubernetes.io/projected/b2aa5656-d3cf-43de-af5a-a9ba522ede37-kube-api-access-ksqz9\") pod \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\" (UID: \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.034017 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2aa5656-d3cf-43de-af5a-a9ba522ede37-operator-scripts\") pod \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\" (UID: \"b2aa5656-d3cf-43de-af5a-a9ba522ede37\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.034045 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-operator-scripts\") pod \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\" (UID: \"5f5d52e5-cdf1-4148-8481-8e8fa5bda200\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.034077 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vkgl\" (UniqueName: \"kubernetes.io/projected/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-kube-api-access-2vkgl\") pod \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\" (UID: \"7b308ac2-d80a-4b8e-9c3b-065c794f6a00\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.034314 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35" (UID: "9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.034674 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.034949 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2aa5656-d3cf-43de-af5a-a9ba522ede37-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2aa5656-d3cf-43de-af5a-a9ba522ede37" (UID: "b2aa5656-d3cf-43de-af5a-a9ba522ede37"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.035281 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b308ac2-d80a-4b8e-9c3b-065c794f6a00" (UID: "7b308ac2-d80a-4b8e-9c3b-065c794f6a00"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.035324 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5f5d52e5-cdf1-4148-8481-8e8fa5bda200" (UID: "5f5d52e5-cdf1-4148-8481-8e8fa5bda200"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.038314 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-kube-api-access-8d7cj" (OuterVolumeSpecName: "kube-api-access-8d7cj") pod "9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35" (UID: "9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35"). InnerVolumeSpecName "kube-api-access-8d7cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.040592 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-kube-api-access-2vkgl" (OuterVolumeSpecName: "kube-api-access-2vkgl") pod "7b308ac2-d80a-4b8e-9c3b-065c794f6a00" (UID: "7b308ac2-d80a-4b8e-9c3b-065c794f6a00"). InnerVolumeSpecName "kube-api-access-2vkgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.043623 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2aa5656-d3cf-43de-af5a-a9ba522ede37-kube-api-access-ksqz9" (OuterVolumeSpecName: "kube-api-access-ksqz9") pod "b2aa5656-d3cf-43de-af5a-a9ba522ede37" (UID: "b2aa5656-d3cf-43de-af5a-a9ba522ede37"). InnerVolumeSpecName "kube-api-access-ksqz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.044860 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-kube-api-access-tk9kd" (OuterVolumeSpecName: "kube-api-access-tk9kd") pod "5f5d52e5-cdf1-4148-8481-8e8fa5bda200" (UID: "5f5d52e5-cdf1-4148-8481-8e8fa5bda200"). InnerVolumeSpecName "kube-api-access-tk9kd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136071 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc94d86-48ee-4deb-9bf2-7606c6de4515-operator-scripts\") pod \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\" (UID: \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136178 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cd21556-0acb-47d3-8bd2-da1a675ac155-operator-scripts\") pod \"0cd21556-0acb-47d3-8bd2-da1a675ac155\" (UID: \"0cd21556-0acb-47d3-8bd2-da1a675ac155\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136264 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxhlh\" (UniqueName: \"kubernetes.io/projected/1fc94d86-48ee-4deb-9bf2-7606c6de4515-kube-api-access-qxhlh\") pod \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\" (UID: \"1fc94d86-48ee-4deb-9bf2-7606c6de4515\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136315 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddsq4\" (UniqueName: \"kubernetes.io/projected/0cd21556-0acb-47d3-8bd2-da1a675ac155-kube-api-access-ddsq4\") pod \"0cd21556-0acb-47d3-8bd2-da1a675ac155\" (UID: \"0cd21556-0acb-47d3-8bd2-da1a675ac155\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136362 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c54c21eb-8621-4959-a9be-de2efd6d1bb0-operator-scripts\") pod \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\" (UID: \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136457 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-kskn5\" (UniqueName: \"kubernetes.io/projected/c54c21eb-8621-4959-a9be-de2efd6d1bb0-kube-api-access-kskn5\") pod \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\" (UID: \"c54c21eb-8621-4959-a9be-de2efd6d1bb0\") " Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136567 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc94d86-48ee-4deb-9bf2-7606c6de4515-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1fc94d86-48ee-4deb-9bf2-7606c6de4515" (UID: "1fc94d86-48ee-4deb-9bf2-7606c6de4515"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136588 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0cd21556-0acb-47d3-8bd2-da1a675ac155-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0cd21556-0acb-47d3-8bd2-da1a675ac155" (UID: "0cd21556-0acb-47d3-8bd2-da1a675ac155"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136866 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136890 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0cd21556-0acb-47d3-8bd2-da1a675ac155-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136920 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vkgl\" (UniqueName: \"kubernetes.io/projected/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-kube-api-access-2vkgl\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136937 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d7cj\" (UniqueName: \"kubernetes.io/projected/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35-kube-api-access-8d7cj\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136949 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b308ac2-d80a-4b8e-9c3b-065c794f6a00-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136960 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk9kd\" (UniqueName: \"kubernetes.io/projected/5f5d52e5-cdf1-4148-8481-8e8fa5bda200-kube-api-access-tk9kd\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136972 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksqz9\" (UniqueName: \"kubernetes.io/projected/b2aa5656-d3cf-43de-af5a-a9ba522ede37-kube-api-access-ksqz9\") on node \"crc\" DevicePath \"\"" 
Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136982 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1fc94d86-48ee-4deb-9bf2-7606c6de4515-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.136996 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2aa5656-d3cf-43de-af5a-a9ba522ede37-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.137021 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c54c21eb-8621-4959-a9be-de2efd6d1bb0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c54c21eb-8621-4959-a9be-de2efd6d1bb0" (UID: "c54c21eb-8621-4959-a9be-de2efd6d1bb0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.139000 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54c21eb-8621-4959-a9be-de2efd6d1bb0-kube-api-access-kskn5" (OuterVolumeSpecName: "kube-api-access-kskn5") pod "c54c21eb-8621-4959-a9be-de2efd6d1bb0" (UID: "c54c21eb-8621-4959-a9be-de2efd6d1bb0"). InnerVolumeSpecName "kube-api-access-kskn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.139063 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cd21556-0acb-47d3-8bd2-da1a675ac155-kube-api-access-ddsq4" (OuterVolumeSpecName: "kube-api-access-ddsq4") pod "0cd21556-0acb-47d3-8bd2-da1a675ac155" (UID: "0cd21556-0acb-47d3-8bd2-da1a675ac155"). InnerVolumeSpecName "kube-api-access-ddsq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.140003 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc94d86-48ee-4deb-9bf2-7606c6de4515-kube-api-access-qxhlh" (OuterVolumeSpecName: "kube-api-access-qxhlh") pod "1fc94d86-48ee-4deb-9bf2-7606c6de4515" (UID: "1fc94d86-48ee-4deb-9bf2-7606c6de4515"). InnerVolumeSpecName "kube-api-access-qxhlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.238828 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kskn5\" (UniqueName: \"kubernetes.io/projected/c54c21eb-8621-4959-a9be-de2efd6d1bb0-kube-api-access-kskn5\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.238978 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxhlh\" (UniqueName: \"kubernetes.io/projected/1fc94d86-48ee-4deb-9bf2-7606c6de4515-kube-api-access-qxhlh\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.238993 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddsq4\" (UniqueName: \"kubernetes.io/projected/0cd21556-0acb-47d3-8bd2-da1a675ac155-kube-api-access-ddsq4\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.239006 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c54c21eb-8621-4959-a9be-de2efd6d1bb0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.324975 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-78vbd" event={"ID":"b2aa5656-d3cf-43de-af5a-a9ba522ede37","Type":"ContainerDied","Data":"c7e98689c2705274c70f84576c7bc1a9c15a0efee375c7d94ba711f3e5fc8962"} Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 
17:18:41.325035 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7e98689c2705274c70f84576c7bc1a9c15a0efee375c7d94ba711f3e5fc8962" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.325114 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-78vbd" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.335597 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-51cb-account-create-update-8zh96" event={"ID":"7b308ac2-d80a-4b8e-9c3b-065c794f6a00","Type":"ContainerDied","Data":"d4159a484e4db246cdb8c33a0f277314feb9d13de7997276af995d4f6fc8296a"} Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.335694 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4159a484e4db246cdb8c33a0f277314feb9d13de7997276af995d4f6fc8296a" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.335827 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-51cb-account-create-update-8zh96" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.345312 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"234e15c54d8c24280d0838b7ace80100263c56747310cb42cb2d1ecb0d694ab6"} Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.347769 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sl79f" event={"ID":"0cd21556-0acb-47d3-8bd2-da1a675ac155","Type":"ContainerDied","Data":"1b0135190a70706478f9283e30e292c03bf361cf3d23f5780f65f6e4c657972d"} Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.347809 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0135190a70706478f9283e30e292c03bf361cf3d23f5780f65f6e4c657972d" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.347883 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sl79f" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.350129 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-c5l22" event={"ID":"9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35","Type":"ContainerDied","Data":"3666e5df406a8298fb8bafbf5376cc54e409f2fa52988aa5a1210744875e9d9d"} Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.350180 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3666e5df406a8298fb8bafbf5376cc54e409f2fa52988aa5a1210744875e9d9d" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.350275 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-c5l22" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.361760 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-2fp7p" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.361758 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-2fp7p" event={"ID":"1fc94d86-48ee-4deb-9bf2-7606c6de4515","Type":"ContainerDied","Data":"aa02bb56919237421bfd79f35818e74c2620ae031a73ccfe8478919757ad84c5"} Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.361876 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa02bb56919237421bfd79f35818e74c2620ae031a73ccfe8478919757ad84c5" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.363595 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d8ff-account-create-update-zsbgm" event={"ID":"5f5d52e5-cdf1-4148-8481-8e8fa5bda200","Type":"ContainerDied","Data":"9c68f76c37ec558cee9e27cb664cbcd31991254aaeef52a50fd958fe072c9493"} Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.363625 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c68f76c37ec558cee9e27cb664cbcd31991254aaeef52a50fd958fe072c9493" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.363666 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-d8ff-account-create-update-zsbgm" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.366892 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-576f-account-create-update-q8hwq" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.366926 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-576f-account-create-update-q8hwq" event={"ID":"18a0cfe2-d206-4a46-b6ff-08a332049b44","Type":"ContainerDied","Data":"5db94b7301490e5c623fc112d0914d73794779b770bd0f8cd4970daaadd71df9"} Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.366969 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db94b7301490e5c623fc112d0914d73794779b770bd0f8cd4970daaadd71df9" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.368816 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-5830-account-create-update-2sv68" event={"ID":"c54c21eb-8621-4959-a9be-de2efd6d1bb0","Type":"ContainerDied","Data":"6c662ca72bf6470354265916dc4cb2fc777892dbb74c57bea4675af1ab2f22b2"} Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.368857 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-5830-account-create-update-2sv68" Dec 03 17:18:41 crc kubenswrapper[4841]: I1203 17:18:41.368869 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c662ca72bf6470354265916dc4cb2fc777892dbb74c57bea4675af1ab2f22b2" Dec 03 17:18:47 crc kubenswrapper[4841]: I1203 17:18:47.426655 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cpmwr" event={"ID":"336897e4-2c50-4739-b719-db8fa6b2389d","Type":"ContainerStarted","Data":"4881bbc5a13872a28cf19465ff6b462d195ad5e10d9c3c24f2fa0bb853686334"} Dec 03 17:18:47 crc kubenswrapper[4841]: I1203 17:18:47.443812 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-cpmwr" podStartSLOduration=9.990182014 podStartE2EDuration="18.443795871s" podCreationTimestamp="2025-12-03 17:18:29 +0000 UTC" firstStartedPulling="2025-12-03 17:18:38.315581096 +0000 UTC m=+1112.703101833" lastFinishedPulling="2025-12-03 17:18:46.769194963 +0000 UTC m=+1121.156715690" observedRunningTime="2025-12-03 17:18:47.439574043 +0000 UTC m=+1121.827094770" watchObservedRunningTime="2025-12-03 17:18:47.443795871 +0000 UTC m=+1121.831316598" Dec 03 17:18:48 crc kubenswrapper[4841]: I1203 17:18:48.438268 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"368c1b6ab4992acadcdbba15f0bdab240b568bc09adcf2d4a29b85460433c712"} Dec 03 17:18:48 crc kubenswrapper[4841]: I1203 17:18:48.438691 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"2659cd09f47db0f82bcb58672bdaf19a79b5991f84b8b2ed8add800c08aeb43d"} Dec 03 17:18:48 crc kubenswrapper[4841]: I1203 17:18:48.438713 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"8ea7ed68f2722d4794e8f4b515dc7c0a50fe336b56590bc66852675e8b0beb46"} Dec 03 17:18:48 crc kubenswrapper[4841]: I1203 17:18:48.438733 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"663c3c402ee94e47c21a937b7837687f2445cf0538588e9a9e3edd0043d0de9c"} Dec 03 17:18:49 crc kubenswrapper[4841]: I1203 17:18:49.450988 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"e599c696b9ec696c3d9f2f43c4a5b1bf1fc3c92b1ee10ec25cd607f08a0e5122"} Dec 03 17:18:49 crc kubenswrapper[4841]: I1203 17:18:49.452809 4841 generic.go:334] "Generic (PLEG): container finished" podID="db235153-a06b-4f9b-9129-76d9e7d7b1e4" containerID="05436975c6cd7a79998e170cfe416d49688d2f7bf8c0e75e3a5abab4ff9239b8" exitCode=0 Dec 03 17:18:49 crc kubenswrapper[4841]: I1203 17:18:49.452885 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6mxtf" event={"ID":"db235153-a06b-4f9b-9129-76d9e7d7b1e4","Type":"ContainerDied","Data":"05436975c6cd7a79998e170cfe416d49688d2f7bf8c0e75e3a5abab4ff9239b8"} Dec 03 17:18:50 crc kubenswrapper[4841]: I1203 17:18:50.467257 4841 generic.go:334] "Generic (PLEG): container finished" podID="336897e4-2c50-4739-b719-db8fa6b2389d" containerID="4881bbc5a13872a28cf19465ff6b462d195ad5e10d9c3c24f2fa0bb853686334" exitCode=0 Dec 03 17:18:50 crc kubenswrapper[4841]: I1203 17:18:50.467366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cpmwr" event={"ID":"336897e4-2c50-4739-b719-db8fa6b2389d","Type":"ContainerDied","Data":"4881bbc5a13872a28cf19465ff6b462d195ad5e10d9c3c24f2fa0bb853686334"} Dec 03 17:18:50 crc kubenswrapper[4841]: I1203 17:18:50.489334 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"90650d914d184b9e63e0ce62cfe8bdc1540dc1c6faa476dcddc9d0a3392536ef"} Dec 03 17:18:50 crc kubenswrapper[4841]: I1203 17:18:50.489419 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"34c66ea5a77ab755ce0c9cad59b399769a8955bc8720375c3b3b9ea9170512a8"} Dec 03 17:18:50 crc kubenswrapper[4841]: I1203 17:18:50.489440 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"f80258d74955e472ef78e89dacb4065ec299e06d9cd1ebfdf0bb66528d81b54e"} Dec 03 17:18:50 crc kubenswrapper[4841]: I1203 17:18:50.489458 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"c013e29c29d4c30335ab32e25dc5440695104793550fc5c5989b03516307697c"} Dec 03 17:18:50 crc kubenswrapper[4841]: I1203 17:18:50.489476 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"84df1debfffb7ecb1771b4b9de7e224706eeb29ae22f98ce6466bed513b111ec"} Dec 03 17:18:50 crc kubenswrapper[4841]: I1203 17:18:50.933021 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.121220 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-combined-ca-bundle\") pod \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.121540 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-db-sync-config-data\") pod \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.121564 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m75k\" (UniqueName: \"kubernetes.io/projected/db235153-a06b-4f9b-9129-76d9e7d7b1e4-kube-api-access-4m75k\") pod \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.121628 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-config-data\") pod \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\" (UID: \"db235153-a06b-4f9b-9129-76d9e7d7b1e4\") " Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.126224 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "db235153-a06b-4f9b-9129-76d9e7d7b1e4" (UID: "db235153-a06b-4f9b-9129-76d9e7d7b1e4"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.126699 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db235153-a06b-4f9b-9129-76d9e7d7b1e4-kube-api-access-4m75k" (OuterVolumeSpecName: "kube-api-access-4m75k") pod "db235153-a06b-4f9b-9129-76d9e7d7b1e4" (UID: "db235153-a06b-4f9b-9129-76d9e7d7b1e4"). InnerVolumeSpecName "kube-api-access-4m75k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.143274 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db235153-a06b-4f9b-9129-76d9e7d7b1e4" (UID: "db235153-a06b-4f9b-9129-76d9e7d7b1e4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.165130 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-config-data" (OuterVolumeSpecName: "config-data") pod "db235153-a06b-4f9b-9129-76d9e7d7b1e4" (UID: "db235153-a06b-4f9b-9129-76d9e7d7b1e4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.223623 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m75k\" (UniqueName: \"kubernetes.io/projected/db235153-a06b-4f9b-9129-76d9e7d7b1e4-kube-api-access-4m75k\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.223677 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.223697 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.223713 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/db235153-a06b-4f9b-9129-76d9e7d7b1e4-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.500511 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-6mxtf" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.500536 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-6mxtf" event={"ID":"db235153-a06b-4f9b-9129-76d9e7d7b1e4","Type":"ContainerDied","Data":"de9e1bfa13cd958ae5c8c158b656e99cd5529871f4e9cc79d85dc158f030c741"} Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.501564 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de9e1bfa13cd958ae5c8c158b656e99cd5529871f4e9cc79d85dc158f030c741" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.524337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"73a827c8-6b3c-4ffa-9c76-3d3591f38182","Type":"ContainerStarted","Data":"23b80f9ea09aab4659dd2cfd21edb8d5e03b6dbc210c0618a16de43d777eafe3"} Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.568312 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=36.689277424 podStartE2EDuration="57.568289202s" podCreationTimestamp="2025-12-03 17:17:54 +0000 UTC" firstStartedPulling="2025-12-03 17:18:28.270545446 +0000 UTC m=+1102.658066193" lastFinishedPulling="2025-12-03 17:18:49.149557214 +0000 UTC m=+1123.537077971" observedRunningTime="2025-12-03 17:18:51.555854961 +0000 UTC m=+1125.943375708" watchObservedRunningTime="2025-12-03 17:18:51.568289202 +0000 UTC m=+1125.955809939" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.847563 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.919837 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-cpklm"] Dec 03 17:18:51 crc kubenswrapper[4841]: E1203 17:18:51.920802 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54c21eb-8621-4959-a9be-de2efd6d1bb0" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.920818 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54c21eb-8621-4959-a9be-de2efd6d1bb0" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: E1203 17:18:51.920847 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f5d52e5-cdf1-4148-8481-8e8fa5bda200" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.920856 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5d52e5-cdf1-4148-8481-8e8fa5bda200" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: E1203 17:18:51.920867 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a0cfe2-d206-4a46-b6ff-08a332049b44" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.920874 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a0cfe2-d206-4a46-b6ff-08a332049b44" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: E1203 17:18:51.920886 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2aa5656-d3cf-43de-af5a-a9ba522ede37" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.920892 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2aa5656-d3cf-43de-af5a-a9ba522ede37" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: E1203 17:18:51.920921 4841 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b308ac2-d80a-4b8e-9c3b-065c794f6a00" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.920927 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b308ac2-d80a-4b8e-9c3b-065c794f6a00" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: E1203 17:18:51.920947 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.920952 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: E1203 17:18:51.920962 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db235153-a06b-4f9b-9129-76d9e7d7b1e4" containerName="glance-db-sync" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.920972 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="db235153-a06b-4f9b-9129-76d9e7d7b1e4" containerName="glance-db-sync" Dec 03 17:18:51 crc kubenswrapper[4841]: E1203 17:18:51.920987 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1fc94d86-48ee-4deb-9bf2-7606c6de4515" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.920994 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc94d86-48ee-4deb-9bf2-7606c6de4515" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: E1203 17:18:51.921006 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="336897e4-2c50-4739-b719-db8fa6b2389d" containerName="keystone-db-sync" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921012 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="336897e4-2c50-4739-b719-db8fa6b2389d" containerName="keystone-db-sync" Dec 03 17:18:51 crc 
kubenswrapper[4841]: E1203 17:18:51.921030 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cd21556-0acb-47d3-8bd2-da1a675ac155" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921038 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cd21556-0acb-47d3-8bd2-da1a675ac155" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921371 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c54c21eb-8621-4959-a9be-de2efd6d1bb0" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921403 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cd21556-0acb-47d3-8bd2-da1a675ac155" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921421 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f5d52e5-cdf1-4148-8481-8e8fa5bda200" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921434 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2aa5656-d3cf-43de-af5a-a9ba522ede37" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921454 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b308ac2-d80a-4b8e-9c3b-065c794f6a00" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921468 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921483 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1fc94d86-48ee-4deb-9bf2-7606c6de4515" containerName="mariadb-database-create" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921491 4841 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="18a0cfe2-d206-4a46-b6ff-08a332049b44" containerName="mariadb-account-create-update" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921509 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="336897e4-2c50-4739-b719-db8fa6b2389d" containerName="keystone-db-sync" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.921521 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="db235153-a06b-4f9b-9129-76d9e7d7b1e4" containerName="glance-db-sync" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.923071 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.934310 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.936469 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzq9g\" (UniqueName: \"kubernetes.io/projected/336897e4-2c50-4739-b719-db8fa6b2389d-kube-api-access-mzq9g\") pod \"336897e4-2c50-4739-b719-db8fa6b2389d\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.936548 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-config-data\") pod \"336897e4-2c50-4739-b719-db8fa6b2389d\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.936730 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-combined-ca-bundle\") pod \"336897e4-2c50-4739-b719-db8fa6b2389d\" (UID: \"336897e4-2c50-4739-b719-db8fa6b2389d\") " Dec 03 17:18:51 crc kubenswrapper[4841]: I1203 17:18:51.997417 
4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/336897e4-2c50-4739-b719-db8fa6b2389d-kube-api-access-mzq9g" (OuterVolumeSpecName: "kube-api-access-mzq9g") pod "336897e4-2c50-4739-b719-db8fa6b2389d" (UID: "336897e4-2c50-4739-b719-db8fa6b2389d"). InnerVolumeSpecName "kube-api-access-mzq9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.020877 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-cpklm"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.046779 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "336897e4-2c50-4739-b719-db8fa6b2389d" (UID: "336897e4-2c50-4739-b719-db8fa6b2389d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.047554 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.047792 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.047968 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-config\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.048115 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-svc\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.048293 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.048454 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9vn8\" (UniqueName: \"kubernetes.io/projected/6523309a-11c0-423f-924e-dccd44f64f7a-kube-api-access-f9vn8\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.048678 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.048776 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzq9g\" (UniqueName: \"kubernetes.io/projected/336897e4-2c50-4739-b719-db8fa6b2389d-kube-api-access-mzq9g\") on node \"crc\" 
DevicePath \"\"" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.074335 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-config-data" (OuterVolumeSpecName: "config-data") pod "336897e4-2c50-4739-b719-db8fa6b2389d" (UID: "336897e4-2c50-4739-b719-db8fa6b2389d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.081614 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-cpklm"] Dec 03 17:18:52 crc kubenswrapper[4841]: E1203 17:18:52.082177 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[config dns-svc dns-swift-storage-0 kube-api-access-f9vn8 ovsdbserver-nb ovsdbserver-sb], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/dnsmasq-dns-764c5664d7-cpklm" podUID="6523309a-11c0-423f-924e-dccd44f64f7a" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.123191 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-ckkgq"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.124481 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.147850 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-ckkgq"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.150290 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9vn8\" (UniqueName: \"kubernetes.io/projected/6523309a-11c0-423f-924e-dccd44f64f7a-kube-api-access-f9vn8\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.150362 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.150421 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.150448 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-config\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.150468 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-svc\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.150508 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.150556 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/336897e4-2c50-4739-b719-db8fa6b2389d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.151376 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.152444 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.154033 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-config\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 
crc kubenswrapper[4841]: I1203 17:18:52.154580 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-svc\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.154993 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.180879 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9vn8\" (UniqueName: \"kubernetes.io/projected/6523309a-11c0-423f-924e-dccd44f64f7a-kube-api-access-f9vn8\") pod \"dnsmasq-dns-764c5664d7-cpklm\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.258620 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.258672 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-config\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.258722 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.258753 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.258787 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d74nw\" (UniqueName: \"kubernetes.io/projected/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-kube-api-access-d74nw\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.258811 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.359861 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc 
kubenswrapper[4841]: I1203 17:18:52.359943 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d74nw\" (UniqueName: \"kubernetes.io/projected/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-kube-api-access-d74nw\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.359971 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.360031 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.360060 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-config\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.360109 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.360970 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.361940 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.362140 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.362564 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-config\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.363012 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.386957 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d74nw\" (UniqueName: 
\"kubernetes.io/projected/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-kube-api-access-d74nw\") pod \"dnsmasq-dns-74f6bcbc87-ckkgq\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.448467 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.540546 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cpmwr" event={"ID":"336897e4-2c50-4739-b719-db8fa6b2389d","Type":"ContainerDied","Data":"a7853d20cd5541575f74f3f82d8ccb61dd8f6cbaf9bd2f966d6e6a18c4a547f4"} Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.540751 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7853d20cd5541575f74f3f82d8ccb61dd8f6cbaf9bd2f966d6e6a18c4a547f4" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.541568 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.541749 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cpmwr" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.604891 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.757258 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-ckkgq"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.765779 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-config\") pod \"6523309a-11c0-423f-924e-dccd44f64f7a\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.765831 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-nb\") pod \"6523309a-11c0-423f-924e-dccd44f64f7a\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.765870 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9vn8\" (UniqueName: \"kubernetes.io/projected/6523309a-11c0-423f-924e-dccd44f64f7a-kube-api-access-f9vn8\") pod \"6523309a-11c0-423f-924e-dccd44f64f7a\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.765922 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-swift-storage-0\") pod \"6523309a-11c0-423f-924e-dccd44f64f7a\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.765974 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-svc\") pod \"6523309a-11c0-423f-924e-dccd44f64f7a\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " 
Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.766080 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-sb\") pod \"6523309a-11c0-423f-924e-dccd44f64f7a\" (UID: \"6523309a-11c0-423f-924e-dccd44f64f7a\") " Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.766593 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6523309a-11c0-423f-924e-dccd44f64f7a" (UID: "6523309a-11c0-423f-924e-dccd44f64f7a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.766657 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6523309a-11c0-423f-924e-dccd44f64f7a" (UID: "6523309a-11c0-423f-924e-dccd44f64f7a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.766725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6523309a-11c0-423f-924e-dccd44f64f7a" (UID: "6523309a-11c0-423f-924e-dccd44f64f7a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.766766 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.766785 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.766936 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6523309a-11c0-423f-924e-dccd44f64f7a" (UID: "6523309a-11c0-423f-924e-dccd44f64f7a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.772419 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-config" (OuterVolumeSpecName: "config") pod "6523309a-11c0-423f-924e-dccd44f64f7a" (UID: "6523309a-11c0-423f-924e-dccd44f64f7a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.779225 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6523309a-11c0-423f-924e-dccd44f64f7a-kube-api-access-f9vn8" (OuterVolumeSpecName: "kube-api-access-f9vn8") pod "6523309a-11c0-423f-924e-dccd44f64f7a" (UID: "6523309a-11c0-423f-924e-dccd44f64f7a"). InnerVolumeSpecName "kube-api-access-f9vn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.789968 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-pwcxw"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.791293 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.796794 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.796995 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.797170 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qfbt" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.797308 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.797418 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.797917 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pwcxw"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.814483 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-skb6s"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.820954 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-skb6s"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.828555 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.868156 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.868181 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9vn8\" (UniqueName: \"kubernetes.io/projected/6523309a-11c0-423f-924e-dccd44f64f7a-kube-api-access-f9vn8\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.868191 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.868200 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6523309a-11c0-423f-924e-dccd44f64f7a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.945974 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-drrfd"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.947176 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.953364 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.953647 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-l5pst" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.962532 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-drrfd"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.968942 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-credential-keys\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.968994 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969091 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969136 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-scripts\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969179 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969231 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-config\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969272 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-config-data\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969345 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4frq\" (UniqueName: \"kubernetes.io/projected/6360034b-9d22-4426-91a5-85d646ea907b-kube-api-access-v4frq\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969395 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldb6x\" (UniqueName: 
\"kubernetes.io/projected/98093b8f-6872-4081-a33a-6d74077a241e-kube-api-access-ldb6x\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969420 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-fernet-keys\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969469 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-combined-ca-bundle\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.969564 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.973692 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-8dbcn"] Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.974820 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.980346 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.980467 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2pvqc" Dec 03 17:18:52 crc kubenswrapper[4841]: I1203 17:18:52.980594 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.018982 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8dbcn"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.047345 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.049361 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.055026 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.062273 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.062756 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071073 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-credential-keys\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071106 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071133 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-config\") pod \"neutron-db-sync-8dbcn\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071162 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071178 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-combined-ca-bundle\") pod \"heat-db-sync-drrfd\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071196 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-scripts\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071216 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071231 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-config\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071249 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-config-data\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071284 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4frq\" (UniqueName: \"kubernetes.io/projected/6360034b-9d22-4426-91a5-85d646ea907b-kube-api-access-v4frq\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071305 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8c7j5\" (UniqueName: \"kubernetes.io/projected/e6ce602c-aad4-4d9d-a924-a200b8d8658d-kube-api-access-8c7j5\") pod \"neutron-db-sync-8dbcn\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071335 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldb6x\" (UniqueName: 
\"kubernetes.io/projected/98093b8f-6872-4081-a33a-6d74077a241e-kube-api-access-ldb6x\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071355 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-fernet-keys\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071380 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-combined-ca-bundle\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071405 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-config-data\") pod \"heat-db-sync-drrfd\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071426 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-combined-ca-bundle\") pod \"neutron-db-sync-8dbcn\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071446 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wm9mt\" (UniqueName: 
\"kubernetes.io/projected/6a7b606d-af6a-477c-9ac6-f93db645651d-kube-api-access-wm9mt\") pod \"heat-db-sync-drrfd\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.071470 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.072245 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.073671 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.074241 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.074274 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-82zhd"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.074833 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-svc\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.076643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-config\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.082805 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.085251 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-combined-ca-bundle\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.085517 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-scripts\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.089554 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-fernet-keys\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.096481 4841 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-82zhd"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.096646 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.096756 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-cwpsc" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.096843 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.107954 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldb6x\" (UniqueName: \"kubernetes.io/projected/98093b8f-6872-4081-a33a-6d74077a241e-kube-api-access-ldb6x\") pod \"dnsmasq-dns-847c4cc679-skb6s\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.108797 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-credential-keys\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.120521 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-config-data\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.135158 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-ckkgq"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.137869 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v4frq\" (UniqueName: \"kubernetes.io/projected/6360034b-9d22-4426-91a5-85d646ea907b-kube-api-access-v4frq\") pod \"keystone-bootstrap-pwcxw\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.158940 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-gj6sq"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.160145 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.163406 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-skp9h" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.163626 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.182969 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194328 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194384 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-config\") pod \"neutron-db-sync-8dbcn\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194403 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-config-data\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194427 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c094953c-fc36-4dda-9497-381f9ae48471-etc-machine-id\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194451 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-config-data\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 
17:18:53.194474 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-combined-ca-bundle\") pod \"heat-db-sync-drrfd\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194499 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg7k6\" (UniqueName: \"kubernetes.io/projected/c094953c-fc36-4dda-9497-381f9ae48471-kube-api-access-sg7k6\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194553 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8c7j5\" (UniqueName: \"kubernetes.io/projected/e6ce602c-aad4-4d9d-a924-a200b8d8658d-kube-api-access-8c7j5\") pod \"neutron-db-sync-8dbcn\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194579 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-scripts\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194598 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-combined-ca-bundle\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194637 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52l7f\" (UniqueName: \"kubernetes.io/projected/bd5216b6-064c-45a5-868c-816d362eced0-kube-api-access-52l7f\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194651 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-db-sync-config-data\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194672 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194700 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-scripts\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194731 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-config-data\") pod \"heat-db-sync-drrfd\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194758 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-combined-ca-bundle\") pod \"neutron-db-sync-8dbcn\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194806 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wm9mt\" (UniqueName: \"kubernetes.io/projected/6a7b606d-af6a-477c-9ac6-f93db645651d-kube-api-access-wm9mt\") pod \"heat-db-sync-drrfd\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194854 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-log-httpd\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.194878 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-run-httpd\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.213320 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-combined-ca-bundle\") pod \"heat-db-sync-drrfd\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.214217 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.221520 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-config\") pod \"neutron-db-sync-8dbcn\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.221589 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-config-data\") pod \"heat-db-sync-drrfd\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.248199 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wm9mt\" (UniqueName: \"kubernetes.io/projected/6a7b606d-af6a-477c-9ac6-f93db645651d-kube-api-access-wm9mt\") pod \"heat-db-sync-drrfd\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.269100 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-drrfd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.270643 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gj6sq"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.273356 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-combined-ca-bundle\") pod \"neutron-db-sync-8dbcn\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.286997 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8c7j5\" (UniqueName: \"kubernetes.io/projected/e6ce602c-aad4-4d9d-a924-a200b8d8658d-kube-api-access-8c7j5\") pod \"neutron-db-sync-8dbcn\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.296604 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-db-sync-config-data\") pod \"barbican-db-sync-gj6sq\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.296816 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-scripts\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.296866 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-combined-ca-bundle\") pod 
\"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.296961 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52l7f\" (UniqueName: \"kubernetes.io/projected/bd5216b6-064c-45a5-868c-816d362eced0-kube-api-access-52l7f\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.296995 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-db-sync-config-data\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297035 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdcrv\" (UniqueName: \"kubernetes.io/projected/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-kube-api-access-xdcrv\") pod \"barbican-db-sync-gj6sq\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297066 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297089 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-combined-ca-bundle\") pod \"barbican-db-sync-gj6sq\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " 
pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-scripts\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297269 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-log-httpd\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297312 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-run-httpd\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297396 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297433 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-config-data\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297470 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c094953c-fc36-4dda-9497-381f9ae48471-etc-machine-id\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297505 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-config-data\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.297547 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg7k6\" (UniqueName: \"kubernetes.io/projected/c094953c-fc36-4dda-9497-381f9ae48471-kube-api-access-sg7k6\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.318436 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-run-httpd\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.320449 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c094953c-fc36-4dda-9497-381f9ae48471-etc-machine-id\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.326565 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-log-httpd\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 
17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.326720 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-scripts\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.328018 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-config-data\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.332261 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-config-data\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.332396 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-combined-ca-bundle\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.332673 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-scripts\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.335121 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.339139 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.339169 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg7k6\" (UniqueName: \"kubernetes.io/projected/c094953c-fc36-4dda-9497-381f9ae48471-kube-api-access-sg7k6\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.362129 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-skb6s"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.364463 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-db-sync-config-data\") pod \"cinder-db-sync-82zhd\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.364813 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52l7f\" (UniqueName: \"kubernetes.io/projected/bd5216b6-064c-45a5-868c-816d362eced0-kube-api-access-52l7f\") pod \"ceilometer-0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.368740 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"bd5216b6-064c-45a5-868c-816d362eced0\") " pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.373128 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.385436 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-gdwlx"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.386788 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.390064 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.390501 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.390746 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-x78cp" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.399531 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-db-sync-config-data\") pod \"barbican-db-sync-gj6sq\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.399797 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdcrv\" (UniqueName: \"kubernetes.io/projected/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-kube-api-access-xdcrv\") pod \"barbican-db-sync-gj6sq\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.399824 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-combined-ca-bundle\") pod \"barbican-db-sync-gj6sq\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.403560 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-db-sync-config-data\") pod \"barbican-db-sync-gj6sq\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.408125 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gdwlx"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.419568 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2vksl"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.420356 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-82zhd" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.421042 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.422891 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-combined-ca-bundle\") pod \"barbican-db-sync-gj6sq\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.425957 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdcrv\" (UniqueName: \"kubernetes.io/projected/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-kube-api-access-xdcrv\") pod \"barbican-db-sync-gj6sq\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.428133 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2vksl"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505251 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505296 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-config-data\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505328 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-config\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505351 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505399 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505420 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q78sd\" (UniqueName: \"kubernetes.io/projected/b2fec744-9e89-4330-88a5-f0e4c2173870-kube-api-access-q78sd\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505439 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505464 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-combined-ca-bundle\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505484 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlq5m\" (UniqueName: \"kubernetes.io/projected/f982a994-2292-49fb-9b20-4f78c9730210-kube-api-access-vlq5m\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505506 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2fec744-9e89-4330-88a5-f0e4c2173870-logs\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.505561 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-scripts\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.520344 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.588148 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-cpklm" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.588144 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" event={"ID":"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e","Type":"ContainerStarted","Data":"6a74f9903f9decec4aa6f7a3b563eb6cabbaafc8b9a560dd7f798a2fb1e4d0c2"} Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.608795 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlq5m\" (UniqueName: \"kubernetes.io/projected/f982a994-2292-49fb-9b20-4f78c9730210-kube-api-access-vlq5m\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.608879 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2fec744-9e89-4330-88a5-f0e4c2173870-logs\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.609267 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-scripts\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.609298 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2fec744-9e89-4330-88a5-f0e4c2173870-logs\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.609384 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.609473 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-config-data\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.609639 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-config\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.609686 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.609793 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.610357 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q78sd\" (UniqueName: 
\"kubernetes.io/projected/b2fec744-9e89-4330-88a5-f0e4c2173870-kube-api-access-q78sd\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.610419 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.610462 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-combined-ca-bundle\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.612740 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.612988 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.613624 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-sb\") pod 
\"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.614404 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-config\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.616135 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.622542 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-scripts\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.624314 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-config-data\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.628388 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlq5m\" (UniqueName: \"kubernetes.io/projected/f982a994-2292-49fb-9b20-4f78c9730210-kube-api-access-vlq5m\") pod \"dnsmasq-dns-785d8bcb8c-2vksl\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 
17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.634314 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-combined-ca-bundle\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.642966 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q78sd\" (UniqueName: \"kubernetes.io/projected/b2fec744-9e89-4330-88a5-f0e4c2173870-kube-api-access-q78sd\") pod \"placement-db-sync-gdwlx\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.662728 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-cpklm"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.668673 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-cpklm"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.763599 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gdwlx" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.788529 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-skb6s"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.791613 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.896279 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.898193 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.899733 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vnmcl" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.901022 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.901178 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.909492 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:18:53 crc kubenswrapper[4841]: I1203 17:18:53.969284 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-pwcxw"] Dec 03 17:18:53 crc kubenswrapper[4841]: W1203 17:18:53.995567 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6360034b_9d22_4426_91a5_85d646ea907b.slice/crio-54b8dc775827ee008d36f6c8dfbfdb1c519f6ecd3301b14a223535360382b799 WatchSource:0}: Error finding container 54b8dc775827ee008d36f6c8dfbfdb1c519f6ecd3301b14a223535360382b799: Status 404 returned error can't find the container with id 54b8dc775827ee008d36f6c8dfbfdb1c519f6ecd3301b14a223535360382b799 Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.014060 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.015515 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.020279 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.027999 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzdmq\" (UniqueName: \"kubernetes.io/projected/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-kube-api-access-vzdmq\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.028070 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.028094 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-config-data\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.028113 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.028160 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-scripts\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.028178 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.028236 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-logs\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.040088 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.072862 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-drrfd"] Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.088882 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-8dbcn"] Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.137726 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-logs\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 
17:18:54.138720 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.138832 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.139053 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzdmq\" (UniqueName: \"kubernetes.io/projected/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-kube-api-access-vzdmq\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.140293 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.140498 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc 
kubenswrapper[4841]: I1203 17:18:54.140619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.140987 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-config-data\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.141085 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.141235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.141351 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.141497 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-scripts\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.141599 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmvkx\" (UniqueName: \"kubernetes.io/projected/1e23266e-1b86-4f4f-9a54-c6082fd38384-kube-api-access-vmvkx\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.141689 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.143609 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.145993 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-logs\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.146857 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.149163 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-config-data\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.149599 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-scripts\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.157259 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.169523 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzdmq\" (UniqueName: \"kubernetes.io/projected/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-kube-api-access-vzdmq\") pod \"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.192205 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-external-api-0\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.219348 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.243416 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.243469 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmvkx\" (UniqueName: \"kubernetes.io/projected/1e23266e-1b86-4f4f-9a54-c6082fd38384-kube-api-access-vmvkx\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.243552 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.243575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.243625 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.243670 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.243728 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.243897 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.244425 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-logs\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.244836 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-httpd-run\") pod 
\"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: W1203 17:18:54.249361 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd5216b6_064c_45a5_868c_816d362eced0.slice/crio-a27ec0622467c98ef63ccb403bfad0c174efda540736a571b2dcba6ab4e53bd1 WatchSource:0}: Error finding container a27ec0622467c98ef63ccb403bfad0c174efda540736a571b2dcba6ab4e53bd1: Status 404 returned error can't find the container with id a27ec0622467c98ef63ccb403bfad0c174efda540736a571b2dcba6ab4e53bd1 Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.258628 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-scripts\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.264504 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-config-data\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.266676 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmvkx\" (UniqueName: \"kubernetes.io/projected/1e23266e-1b86-4f4f-9a54-c6082fd38384-kube-api-access-vmvkx\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.269696 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.279294 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6523309a-11c0-423f-924e-dccd44f64f7a" path="/var/lib/kubelet/pods/6523309a-11c0-423f-924e-dccd44f64f7a/volumes" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.293051 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.293695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.356741 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.445637 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-82zhd"] Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.451130 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-gj6sq"] Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.577362 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-gdwlx"] Dec 03 17:18:54 crc kubenswrapper[4841]: W1203 17:18:54.579110 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2fec744_9e89_4330_88a5_f0e4c2173870.slice/crio-717c5c5a087ed366c07199e2d501b1349f73ee9001a2c7a310bb59bb6bb3373f WatchSource:0}: Error finding container 717c5c5a087ed366c07199e2d501b1349f73ee9001a2c7a310bb59bb6bb3373f: Status 404 returned error can't find the container with id 717c5c5a087ed366c07199e2d501b1349f73ee9001a2c7a310bb59bb6bb3373f Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.593529 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2vksl"] Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.607093 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd5216b6-064c-45a5-868c-816d362eced0","Type":"ContainerStarted","Data":"a27ec0622467c98ef63ccb403bfad0c174efda540736a571b2dcba6ab4e53bd1"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.608218 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pwcxw" event={"ID":"6360034b-9d22-4426-91a5-85d646ea907b","Type":"ContainerStarted","Data":"4124a709b9468d60618d0a00013f2e8302ae03033ce52df7b44443775fcb92f3"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.608246 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-pwcxw" event={"ID":"6360034b-9d22-4426-91a5-85d646ea907b","Type":"ContainerStarted","Data":"54b8dc775827ee008d36f6c8dfbfdb1c519f6ecd3301b14a223535360382b799"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.610589 4841 generic.go:334] "Generic (PLEG): container finished" podID="2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" containerID="e00ca1059f34efcd74eff1fa24b79835db70101583bffc35d16dd1e67eeae7a5" exitCode=0 Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.610660 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" event={"ID":"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e","Type":"ContainerDied","Data":"e00ca1059f34efcd74eff1fa24b79835db70101583bffc35d16dd1e67eeae7a5"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.616393 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" event={"ID":"f982a994-2292-49fb-9b20-4f78c9730210","Type":"ContainerStarted","Data":"b78b85cef084a674494aaf17a47a4babd04a8f3296288cb39eb3129a91727fcd"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.618767 4841 generic.go:334] "Generic (PLEG): container finished" podID="98093b8f-6872-4081-a33a-6d74077a241e" containerID="326e94f6aa8d31ff58018dc45c3b32dccee2078f7a89fcb2ad3306cfa326a3dc" exitCode=0 Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.618818 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-skb6s" event={"ID":"98093b8f-6872-4081-a33a-6d74077a241e","Type":"ContainerDied","Data":"326e94f6aa8d31ff58018dc45c3b32dccee2078f7a89fcb2ad3306cfa326a3dc"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.618840 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-skb6s" event={"ID":"98093b8f-6872-4081-a33a-6d74077a241e","Type":"ContainerStarted","Data":"03ebe60058bfd71640af6e85914eecbfa987379b4bf8f3215acb94f9932e9a64"} Dec 03 17:18:54 crc kubenswrapper[4841]: 
I1203 17:18:54.620617 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-drrfd" event={"ID":"6a7b606d-af6a-477c-9ac6-f93db645651d","Type":"ContainerStarted","Data":"f6910c71e7e18e9b0a7488a3fd6a2a419128c94e6bf97fdcc21f38ac41b6bd9b"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.622143 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gdwlx" event={"ID":"b2fec744-9e89-4330-88a5-f0e4c2173870","Type":"ContainerStarted","Data":"717c5c5a087ed366c07199e2d501b1349f73ee9001a2c7a310bb59bb6bb3373f"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.629462 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-pwcxw" podStartSLOduration=2.629445057 podStartE2EDuration="2.629445057s" podCreationTimestamp="2025-12-03 17:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:54.62831352 +0000 UTC m=+1129.015834247" watchObservedRunningTime="2025-12-03 17:18:54.629445057 +0000 UTC m=+1129.016965784" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.636872 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-82zhd" event={"ID":"c094953c-fc36-4dda-9497-381f9ae48471","Type":"ContainerStarted","Data":"8cb206a9bb64000cd8d0871a84ddb75b2ed59f8e6d869a4013fca8ff3c227306"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.638674 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gj6sq" event={"ID":"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa","Type":"ContainerStarted","Data":"94e220eef109e4dd9375041dbff128f2443c05eea616de394c57fd65864317fd"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.641484 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8dbcn" 
event={"ID":"e6ce602c-aad4-4d9d-a924-a200b8d8658d","Type":"ContainerStarted","Data":"9de148a07dcb85b7edd57f5ca6a713f4e0997490566e3dab9624a50d096a551f"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.641530 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8dbcn" event={"ID":"e6ce602c-aad4-4d9d-a924-a200b8d8658d","Type":"ContainerStarted","Data":"01b11265860ef7d559eb6cc83dcf1c77fca03bfbfca440a6a20cd64b42c50193"} Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.704339 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-8dbcn" podStartSLOduration=2.704314015 podStartE2EDuration="2.704314015s" podCreationTimestamp="2025-12-03 17:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:54.689616702 +0000 UTC m=+1129.077137429" watchObservedRunningTime="2025-12-03 17:18:54.704314015 +0000 UTC m=+1129.091834742" Dec 03 17:18:54 crc kubenswrapper[4841]: I1203 17:18:54.955156 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.068144 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.165084 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.193734 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.270100 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-config\") pod \"98093b8f-6872-4081-a33a-6d74077a241e\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.270197 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-swift-storage-0\") pod \"98093b8f-6872-4081-a33a-6d74077a241e\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.270497 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-svc\") pod \"98093b8f-6872-4081-a33a-6d74077a241e\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.270583 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldb6x\" (UniqueName: \"kubernetes.io/projected/98093b8f-6872-4081-a33a-6d74077a241e-kube-api-access-ldb6x\") pod \"98093b8f-6872-4081-a33a-6d74077a241e\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.270636 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-nb\") pod \"98093b8f-6872-4081-a33a-6d74077a241e\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.270659 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-sb\") pod \"98093b8f-6872-4081-a33a-6d74077a241e\" (UID: \"98093b8f-6872-4081-a33a-6d74077a241e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.272717 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.301495 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98093b8f-6872-4081-a33a-6d74077a241e-kube-api-access-ldb6x" (OuterVolumeSpecName: "kube-api-access-ldb6x") pod "98093b8f-6872-4081-a33a-6d74077a241e" (UID: "98093b8f-6872-4081-a33a-6d74077a241e"). InnerVolumeSpecName "kube-api-access-ldb6x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.312037 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.316559 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98093b8f-6872-4081-a33a-6d74077a241e" (UID: "98093b8f-6872-4081-a33a-6d74077a241e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.319899 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.329302 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "98093b8f-6872-4081-a33a-6d74077a241e" (UID: "98093b8f-6872-4081-a33a-6d74077a241e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.338393 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-config" (OuterVolumeSpecName: "config") pod "98093b8f-6872-4081-a33a-6d74077a241e" (UID: "98093b8f-6872-4081-a33a-6d74077a241e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.351138 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98093b8f-6872-4081-a33a-6d74077a241e" (UID: "98093b8f-6872-4081-a33a-6d74077a241e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.378722 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-swift-storage-0\") pod \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.379252 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-svc\") pod \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.379479 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-sb\") pod \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " Dec 03 17:18:55 crc 
kubenswrapper[4841]: I1203 17:18:55.379552 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-nb\") pod \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.379575 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-config\") pod \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.379704 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d74nw\" (UniqueName: \"kubernetes.io/projected/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-kube-api-access-d74nw\") pod \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\" (UID: \"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e\") " Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.380497 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.380513 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldb6x\" (UniqueName: \"kubernetes.io/projected/98093b8f-6872-4081-a33a-6d74077a241e-kube-api-access-ldb6x\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.380533 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.380541 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.380549 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.393645 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-kube-api-access-d74nw" (OuterVolumeSpecName: "kube-api-access-d74nw") pod "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" (UID: "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e"). InnerVolumeSpecName "kube-api-access-d74nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.436515 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" (UID: "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.444531 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-config" (OuterVolumeSpecName: "config") pod "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" (UID: "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.448415 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" (UID: "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.453984 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" (UID: "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.457162 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" (UID: "2b51dd1f-cc31-4804-ae8f-3bc292d9be5e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.457273 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98093b8f-6872-4081-a33a-6d74077a241e" (UID: "98093b8f-6872-4081-a33a-6d74077a241e"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.482231 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.482280 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.482290 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.482298 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.482307 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d74nw\" (UniqueName: \"kubernetes.io/projected/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-kube-api-access-d74nw\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.482315 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.482323 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98093b8f-6872-4081-a33a-6d74077a241e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.707314 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec","Type":"ContainerStarted","Data":"86de5dd59285295d3c5b9cf57cea0ef51075a2119999d31bf2f1168b427bb56d"} Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.710363 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e23266e-1b86-4f4f-9a54-c6082fd38384","Type":"ContainerStarted","Data":"af7e7554f878baf9e8f4979d8f337f6dfa9169ddb40d7ee6660932bc0867c0aa"} Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.714447 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" event={"ID":"2b51dd1f-cc31-4804-ae8f-3bc292d9be5e","Type":"ContainerDied","Data":"6a74f9903f9decec4aa6f7a3b563eb6cabbaafc8b9a560dd7f798a2fb1e4d0c2"} Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.715307 4841 scope.go:117] "RemoveContainer" containerID="e00ca1059f34efcd74eff1fa24b79835db70101583bffc35d16dd1e67eeae7a5" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.715704 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-ckkgq" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.720090 4841 generic.go:334] "Generic (PLEG): container finished" podID="f982a994-2292-49fb-9b20-4f78c9730210" containerID="510aecc00db9e1df7bb526f80c2867276455332dad5125a58e7abc1cf9561e6d" exitCode=0 Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.720195 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" event={"ID":"f982a994-2292-49fb-9b20-4f78c9730210","Type":"ContainerDied","Data":"510aecc00db9e1df7bb526f80c2867276455332dad5125a58e7abc1cf9561e6d"} Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.727146 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-skb6s" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.737657 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-skb6s" event={"ID":"98093b8f-6872-4081-a33a-6d74077a241e","Type":"ContainerDied","Data":"03ebe60058bfd71640af6e85914eecbfa987379b4bf8f3215acb94f9932e9a64"} Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.837451 4841 scope.go:117] "RemoveContainer" containerID="326e94f6aa8d31ff58018dc45c3b32dccee2078f7a89fcb2ad3306cfa326a3dc" Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.873078 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-ckkgq"] Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.895831 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-ckkgq"] Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.919688 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-skb6s"] Dec 03 17:18:55 crc kubenswrapper[4841]: I1203 17:18:55.937572 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-skb6s"] Dec 03 17:18:56 crc kubenswrapper[4841]: I1203 17:18:56.296601 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" path="/var/lib/kubelet/pods/2b51dd1f-cc31-4804-ae8f-3bc292d9be5e/volumes" Dec 03 17:18:56 crc kubenswrapper[4841]: I1203 17:18:56.297234 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98093b8f-6872-4081-a33a-6d74077a241e" path="/var/lib/kubelet/pods/98093b8f-6872-4081-a33a-6d74077a241e/volumes" Dec 03 17:18:56 crc kubenswrapper[4841]: I1203 17:18:56.748590 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" 
event={"ID":"f982a994-2292-49fb-9b20-4f78c9730210","Type":"ContainerStarted","Data":"b8ceb4deb350fd244cdb0a09938369b3859bf9fcdbd40f3d9ca04d4c73082ec1"} Dec 03 17:18:56 crc kubenswrapper[4841]: I1203 17:18:56.748935 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:18:56 crc kubenswrapper[4841]: I1203 17:18:56.755480 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec","Type":"ContainerStarted","Data":"1f98a46f9130f48503513e3982adb5d8453cda50deb785e715a2b280bd6b9e82"} Dec 03 17:18:56 crc kubenswrapper[4841]: I1203 17:18:56.766866 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" podStartSLOduration=3.766847807 podStartE2EDuration="3.766847807s" podCreationTimestamp="2025-12-03 17:18:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:56.765103026 +0000 UTC m=+1131.152623763" watchObservedRunningTime="2025-12-03 17:18:56.766847807 +0000 UTC m=+1131.154368534" Dec 03 17:18:56 crc kubenswrapper[4841]: I1203 17:18:56.771857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e23266e-1b86-4f4f-9a54-c6082fd38384","Type":"ContainerStarted","Data":"3430a262df7488cf7cadd3b02087a329c1e9e795e400ef525d1b62fe323132c5"} Dec 03 17:18:57 crc kubenswrapper[4841]: I1203 17:18:57.787280 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec","Type":"ContainerStarted","Data":"e401cc6123b4addc5abe3841f393df9ca9de27c031414d582e5fc372d9ac85aa"} Dec 03 17:18:57 crc kubenswrapper[4841]: I1203 17:18:57.787392 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/glance-default-external-api-0" podUID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerName="glance-log" containerID="cri-o://1f98a46f9130f48503513e3982adb5d8453cda50deb785e715a2b280bd6b9e82" gracePeriod=30 Dec 03 17:18:57 crc kubenswrapper[4841]: I1203 17:18:57.787608 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerName="glance-httpd" containerID="cri-o://e401cc6123b4addc5abe3841f393df9ca9de27c031414d582e5fc372d9ac85aa" gracePeriod=30 Dec 03 17:18:57 crc kubenswrapper[4841]: I1203 17:18:57.795034 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e23266e-1b86-4f4f-9a54-c6082fd38384","Type":"ContainerStarted","Data":"e11a8dbc5f1be6ba2686e11670aa11966266c6314e9773ecdf136dcdd2c84ad9"} Dec 03 17:18:57 crc kubenswrapper[4841]: I1203 17:18:57.795188 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerName="glance-log" containerID="cri-o://3430a262df7488cf7cadd3b02087a329c1e9e795e400ef525d1b62fe323132c5" gracePeriod=30 Dec 03 17:18:57 crc kubenswrapper[4841]: I1203 17:18:57.795203 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerName="glance-httpd" containerID="cri-o://e11a8dbc5f1be6ba2686e11670aa11966266c6314e9773ecdf136dcdd2c84ad9" gracePeriod=30 Dec 03 17:18:57 crc kubenswrapper[4841]: I1203 17:18:57.819427 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.819411289 podStartE2EDuration="5.819411289s" podCreationTimestamp="2025-12-03 17:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:57.811943545 +0000 UTC m=+1132.199464272" watchObservedRunningTime="2025-12-03 17:18:57.819411289 +0000 UTC m=+1132.206932006" Dec 03 17:18:57 crc kubenswrapper[4841]: I1203 17:18:57.852930 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.852882861 podStartE2EDuration="5.852882861s" podCreationTimestamp="2025-12-03 17:18:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:18:57.841054264 +0000 UTC m=+1132.228574991" watchObservedRunningTime="2025-12-03 17:18:57.852882861 +0000 UTC m=+1132.240403588" Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.819212 4841 generic.go:334] "Generic (PLEG): container finished" podID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerID="e401cc6123b4addc5abe3841f393df9ca9de27c031414d582e5fc372d9ac85aa" exitCode=0 Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.819246 4841 generic.go:334] "Generic (PLEG): container finished" podID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerID="1f98a46f9130f48503513e3982adb5d8453cda50deb785e715a2b280bd6b9e82" exitCode=143 Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.819297 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec","Type":"ContainerDied","Data":"e401cc6123b4addc5abe3841f393df9ca9de27c031414d582e5fc372d9ac85aa"} Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.819324 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec","Type":"ContainerDied","Data":"1f98a46f9130f48503513e3982adb5d8453cda50deb785e715a2b280bd6b9e82"} Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.821806 4841 
generic.go:334] "Generic (PLEG): container finished" podID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerID="e11a8dbc5f1be6ba2686e11670aa11966266c6314e9773ecdf136dcdd2c84ad9" exitCode=0 Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.821835 4841 generic.go:334] "Generic (PLEG): container finished" podID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerID="3430a262df7488cf7cadd3b02087a329c1e9e795e400ef525d1b62fe323132c5" exitCode=143 Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.821888 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e23266e-1b86-4f4f-9a54-c6082fd38384","Type":"ContainerDied","Data":"e11a8dbc5f1be6ba2686e11670aa11966266c6314e9773ecdf136dcdd2c84ad9"} Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.821955 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e23266e-1b86-4f4f-9a54-c6082fd38384","Type":"ContainerDied","Data":"3430a262df7488cf7cadd3b02087a329c1e9e795e400ef525d1b62fe323132c5"} Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.823648 4841 generic.go:334] "Generic (PLEG): container finished" podID="6360034b-9d22-4426-91a5-85d646ea907b" containerID="4124a709b9468d60618d0a00013f2e8302ae03033ce52df7b44443775fcb92f3" exitCode=0 Dec 03 17:18:58 crc kubenswrapper[4841]: I1203 17:18:58.823682 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pwcxw" event={"ID":"6360034b-9d22-4426-91a5-85d646ea907b","Type":"ContainerDied","Data":"4124a709b9468d60618d0a00013f2e8302ae03033ce52df7b44443775fcb92f3"} Dec 03 17:19:03 crc kubenswrapper[4841]: I1203 17:19:03.794091 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:19:03 crc kubenswrapper[4841]: I1203 17:19:03.851848 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-znrqh"] Dec 03 17:19:03 
crc kubenswrapper[4841]: I1203 17:19:03.852326 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-znrqh" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" containerName="dnsmasq-dns" containerID="cri-o://f7c37848c1193c51c133fd068f98e947890b68d25d11f0a02566abbca90f818a" gracePeriod=10 Dec 03 17:19:04 crc kubenswrapper[4841]: I1203 17:19:04.857673 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-znrqh" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: connect: connection refused" Dec 03 17:19:04 crc kubenswrapper[4841]: I1203 17:19:04.907764 4841 generic.go:334] "Generic (PLEG): container finished" podID="0d1d957c-03f0-472f-888f-f410cb214bba" containerID="f7c37848c1193c51c133fd068f98e947890b68d25d11f0a02566abbca90f818a" exitCode=0 Dec 03 17:19:04 crc kubenswrapper[4841]: I1203 17:19:04.907813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-znrqh" event={"ID":"0d1d957c-03f0-472f-888f-f410cb214bba","Type":"ContainerDied","Data":"f7c37848c1193c51c133fd068f98e947890b68d25d11f0a02566abbca90f818a"} Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.215673 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.375373 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzdmq\" (UniqueName: \"kubernetes.io/projected/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-kube-api-access-vzdmq\") pod \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.375666 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-httpd-run\") pod \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.375781 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-combined-ca-bundle\") pod \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.375814 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-scripts\") pod \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.375840 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.375975 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-config-data\") pod \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.376047 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-logs\") pod \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\" (UID: \"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec\") " Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.376599 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-logs" (OuterVolumeSpecName: "logs") pod "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" (UID: "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.376819 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" (UID: "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.377957 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.378076 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.382656 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-scripts" (OuterVolumeSpecName: "scripts") pod "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" (UID: "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.384164 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-kube-api-access-vzdmq" (OuterVolumeSpecName: "kube-api-access-vzdmq") pod "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" (UID: "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec"). InnerVolumeSpecName "kube-api-access-vzdmq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.397374 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" (UID: "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.403872 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" (UID: "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.424191 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-config-data" (OuterVolumeSpecName: "config-data") pod "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" (UID: "3df7b49f-8dc8-4c98-ad9a-faed6b2439ec"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.479625 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.479672 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzdmq\" (UniqueName: \"kubernetes.io/projected/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-kube-api-access-vzdmq\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.479688 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.479701 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec-scripts\") on node \"crc\" DevicePath \"\"" 
Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.479742 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.505678 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.581760 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.949474 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3df7b49f-8dc8-4c98-ad9a-faed6b2439ec","Type":"ContainerDied","Data":"86de5dd59285295d3c5b9cf57cea0ef51075a2119999d31bf2f1168b427bb56d"} Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.949527 4841 scope.go:117] "RemoveContainer" containerID="e401cc6123b4addc5abe3841f393df9ca9de27c031414d582e5fc372d9ac85aa" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.949642 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:19:08 crc kubenswrapper[4841]: I1203 17:19:08.992423 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.011132 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.022751 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.023389 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98093b8f-6872-4081-a33a-6d74077a241e" containerName="init" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.023420 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="98093b8f-6872-4081-a33a-6d74077a241e" containerName="init" Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.023434 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerName="glance-httpd" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.023446 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerName="glance-httpd" Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.023472 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerName="glance-log" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.023483 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerName="glance-log" Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.023502 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" containerName="init" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.023512 4841 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" containerName="init" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.023817 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b51dd1f-cc31-4804-ae8f-3bc292d9be5e" containerName="init" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.023840 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerName="glance-httpd" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.023880 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" containerName="glance-log" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.023893 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="98093b8f-6872-4081-a33a-6d74077a241e" containerName="init" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.025158 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.030796 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.032168 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.032654 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.193415 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc 
kubenswrapper[4841]: I1203 17:19:09.193519 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-logs\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.193567 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.193643 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.193668 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzzkc\" (UniqueName: \"kubernetes.io/projected/51ec7e80-3ef6-478e-b388-5835beb1c733-kube-api-access-qzzkc\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.193689 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-config-data\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " 
pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.193706 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-scripts\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.193726 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.294794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.294890 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.294931 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzzkc\" (UniqueName: \"kubernetes.io/projected/51ec7e80-3ef6-478e-b388-5835beb1c733-kube-api-access-qzzkc\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " 
pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.294955 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-config-data\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.294972 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-scripts\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.294993 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.295018 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.295083 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-logs\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.295480 
4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-logs\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.295610 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.295824 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.299020 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.301464 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-config-data\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.301485 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.310270 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-scripts\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.314050 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzzkc\" (UniqueName: \"kubernetes.io/projected/51ec7e80-3ef6-478e-b388-5835beb1c733-kube-api-access-qzzkc\") pod \"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.316884 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.317063 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.320806 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-external-api-0\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.348679 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.587100 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.587266 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xdcrv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,Re
adOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-gj6sq_openstack(084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.589697 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-gj6sq" podUID="084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.694896 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.702506 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803128 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-httpd-run\") pod \"1e23266e-1b86-4f4f-9a54-c6082fd38384\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803366 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-combined-ca-bundle\") pod \"6360034b-9d22-4426-91a5-85d646ea907b\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803447 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-combined-ca-bundle\") pod \"1e23266e-1b86-4f4f-9a54-c6082fd38384\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803528 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"1e23266e-1b86-4f4f-9a54-c6082fd38384\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803596 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-logs\") pod \"1e23266e-1b86-4f4f-9a54-c6082fd38384\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803686 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-scripts\") pod \"1e23266e-1b86-4f4f-9a54-c6082fd38384\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803744 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-fernet-keys\") pod \"6360034b-9d22-4426-91a5-85d646ea907b\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803837 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4frq\" (UniqueName: \"kubernetes.io/projected/6360034b-9d22-4426-91a5-85d646ea907b-kube-api-access-v4frq\") pod \"6360034b-9d22-4426-91a5-85d646ea907b\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803870 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-credential-keys\") pod \"6360034b-9d22-4426-91a5-85d646ea907b\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.803953 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-config-data\") pod \"1e23266e-1b86-4f4f-9a54-c6082fd38384\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.804008 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmvkx\" (UniqueName: \"kubernetes.io/projected/1e23266e-1b86-4f4f-9a54-c6082fd38384-kube-api-access-vmvkx\") pod \"1e23266e-1b86-4f4f-9a54-c6082fd38384\" (UID: \"1e23266e-1b86-4f4f-9a54-c6082fd38384\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 
17:19:09.804030 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-scripts\") pod \"6360034b-9d22-4426-91a5-85d646ea907b\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.804090 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-config-data\") pod \"6360034b-9d22-4426-91a5-85d646ea907b\" (UID: \"6360034b-9d22-4426-91a5-85d646ea907b\") " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.804302 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-logs" (OuterVolumeSpecName: "logs") pod "1e23266e-1b86-4f4f-9a54-c6082fd38384" (UID: "1e23266e-1b86-4f4f-9a54-c6082fd38384"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.804506 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1e23266e-1b86-4f4f-9a54-c6082fd38384" (UID: "1e23266e-1b86-4f4f-9a54-c6082fd38384"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.805232 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.805253 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1e23266e-1b86-4f4f-9a54-c6082fd38384-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.808409 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-scripts" (OuterVolumeSpecName: "scripts") pod "1e23266e-1b86-4f4f-9a54-c6082fd38384" (UID: "1e23266e-1b86-4f4f-9a54-c6082fd38384"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.808801 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "1e23266e-1b86-4f4f-9a54-c6082fd38384" (UID: "1e23266e-1b86-4f4f-9a54-c6082fd38384"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.808885 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e23266e-1b86-4f4f-9a54-c6082fd38384-kube-api-access-vmvkx" (OuterVolumeSpecName: "kube-api-access-vmvkx") pod "1e23266e-1b86-4f4f-9a54-c6082fd38384" (UID: "1e23266e-1b86-4f4f-9a54-c6082fd38384"). InnerVolumeSpecName "kube-api-access-vmvkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.809511 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "6360034b-9d22-4426-91a5-85d646ea907b" (UID: "6360034b-9d22-4426-91a5-85d646ea907b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.810671 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "6360034b-9d22-4426-91a5-85d646ea907b" (UID: "6360034b-9d22-4426-91a5-85d646ea907b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.810863 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6360034b-9d22-4426-91a5-85d646ea907b-kube-api-access-v4frq" (OuterVolumeSpecName: "kube-api-access-v4frq") pod "6360034b-9d22-4426-91a5-85d646ea907b" (UID: "6360034b-9d22-4426-91a5-85d646ea907b"). InnerVolumeSpecName "kube-api-access-v4frq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.821479 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-scripts" (OuterVolumeSpecName: "scripts") pod "6360034b-9d22-4426-91a5-85d646ea907b" (UID: "6360034b-9d22-4426-91a5-85d646ea907b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.833453 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-config-data" (OuterVolumeSpecName: "config-data") pod "6360034b-9d22-4426-91a5-85d646ea907b" (UID: "6360034b-9d22-4426-91a5-85d646ea907b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.838116 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e23266e-1b86-4f4f-9a54-c6082fd38384" (UID: "1e23266e-1b86-4f4f-9a54-c6082fd38384"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.841780 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6360034b-9d22-4426-91a5-85d646ea907b" (UID: "6360034b-9d22-4426-91a5-85d646ea907b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.866129 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-config-data" (OuterVolumeSpecName: "config-data") pod "1e23266e-1b86-4f4f-9a54-c6082fd38384" (UID: "1e23266e-1b86-4f4f-9a54-c6082fd38384"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907112 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907168 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907179 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907188 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907199 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4frq\" (UniqueName: \"kubernetes.io/projected/6360034b-9d22-4426-91a5-85d646ea907b-kube-api-access-v4frq\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907209 4841 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907217 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e23266e-1b86-4f4f-9a54-c6082fd38384-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907227 4841 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-vmvkx\" (UniqueName: \"kubernetes.io/projected/1e23266e-1b86-4f4f-9a54-c6082fd38384-kube-api-access-vmvkx\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907236 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907245 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.907252 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6360034b-9d22-4426-91a5-85d646ea907b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.934094 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.961723 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-pwcxw" event={"ID":"6360034b-9d22-4426-91a5-85d646ea907b","Type":"ContainerDied","Data":"54b8dc775827ee008d36f6c8dfbfdb1c519f6ecd3301b14a223535360382b799"} Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.961746 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-pwcxw" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.961763 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54b8dc775827ee008d36f6c8dfbfdb1c519f6ecd3301b14a223535360382b799" Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.964151 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"1e23266e-1b86-4f4f-9a54-c6082fd38384","Type":"ContainerDied","Data":"af7e7554f878baf9e8f4979d8f337f6dfa9169ddb40d7ee6660932bc0867c0aa"} Dec 03 17:19:09 crc kubenswrapper[4841]: I1203 17:19:09.964226 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.965516 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-gj6sq" podUID="084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.983786 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Dec 03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.983944 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d 
db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wm9mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-drrfd_openstack(6a7b606d-af6a-477c-9ac6-f93db645651d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 
03 17:19:09 crc kubenswrapper[4841]: E1203 17:19:09.985154 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-drrfd" podUID="6a7b606d-af6a-477c-9ac6-f93db645651d" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.008440 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.010031 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.031017 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.042162 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:19:10 crc kubenswrapper[4841]: E1203 17:19:10.042551 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerName="glance-log" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.042568 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerName="glance-log" Dec 03 17:19:10 crc kubenswrapper[4841]: E1203 17:19:10.042584 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6360034b-9d22-4426-91a5-85d646ea907b" containerName="keystone-bootstrap" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.042591 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6360034b-9d22-4426-91a5-85d646ea907b" containerName="keystone-bootstrap" Dec 03 17:19:10 crc kubenswrapper[4841]: E1203 17:19:10.042620 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerName="glance-httpd" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.042626 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerName="glance-httpd" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.042782 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6360034b-9d22-4426-91a5-85d646ea907b" containerName="keystone-bootstrap" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.042822 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerName="glance-log" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.042834 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e23266e-1b86-4f4f-9a54-c6082fd38384" containerName="glance-httpd" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.043743 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.046412 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.046609 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.051843 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.211538 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 
17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.211603 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.211645 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-config-data\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.211664 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-scripts\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.211706 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-logs\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.211721 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd426\" (UniqueName: \"kubernetes.io/projected/749c6395-a67d-40f3-9347-206393f91228-kube-api-access-nd426\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 
17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.211738 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.211796 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.248832 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e23266e-1b86-4f4f-9a54-c6082fd38384" path="/var/lib/kubelet/pods/1e23266e-1b86-4f4f-9a54-c6082fd38384/volumes" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.249677 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3df7b49f-8dc8-4c98-ad9a-faed6b2439ec" path="/var/lib/kubelet/pods/3df7b49f-8dc8-4c98-ad9a-faed6b2439ec/volumes" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.313097 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.313197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.313218 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-scripts\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.313263 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-logs\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.313287 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd426\" (UniqueName: \"kubernetes.io/projected/749c6395-a67d-40f3-9347-206393f91228-kube-api-access-nd426\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.313312 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.313392 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.313456 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.313761 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.314141 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-logs\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.314326 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.318661 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-config-data\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: 
I1203 17:19:10.319047 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.320648 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-scripts\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.321237 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.333462 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd426\" (UniqueName: \"kubernetes.io/projected/749c6395-a67d-40f3-9347-206393f91228-kube-api-access-nd426\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.362275 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.373263 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.796895 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-pwcxw"] Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.804124 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-pwcxw"] Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.904434 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-8lggq"] Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.905807 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.912440 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8lggq"] Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.918385 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.918379 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qfbt" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.918421 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.918467 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 17:19:10 crc kubenswrapper[4841]: I1203 17:19:10.918527 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 17:19:10 crc kubenswrapper[4841]: E1203 17:19:10.973533 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-drrfd" podUID="6a7b606d-af6a-477c-9ac6-f93db645651d" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.024032 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-scripts\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.024109 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-config-data\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.024160 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-926l6\" (UniqueName: \"kubernetes.io/projected/c366f0b0-a1f7-4452-93c9-d408d117d651-kube-api-access-926l6\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.024294 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-fernet-keys\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.024504 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-combined-ca-bundle\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.024539 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-credential-keys\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.126789 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-scripts\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.126865 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-config-data\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.126928 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-926l6\" (UniqueName: \"kubernetes.io/projected/c366f0b0-a1f7-4452-93c9-d408d117d651-kube-api-access-926l6\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.126957 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-fernet-keys\") pod 
\"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.126998 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-combined-ca-bundle\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.127013 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-credential-keys\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.131578 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-credential-keys\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.131846 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-scripts\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.132369 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-config-data\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc 
kubenswrapper[4841]: I1203 17:19:11.134431 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-combined-ca-bundle\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.134555 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-fernet-keys\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.144540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-926l6\" (UniqueName: \"kubernetes.io/projected/c366f0b0-a1f7-4452-93c9-d408d117d651-kube-api-access-926l6\") pod \"keystone-bootstrap-8lggq\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:11 crc kubenswrapper[4841]: I1203 17:19:11.237258 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:12 crc kubenswrapper[4841]: I1203 17:19:12.248863 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6360034b-9d22-4426-91a5-85d646ea907b" path="/var/lib/kubelet/pods/6360034b-9d22-4426-91a5-85d646ea907b/volumes" Dec 03 17:19:14 crc kubenswrapper[4841]: I1203 17:19:14.858537 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-znrqh" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.019131 4841 generic.go:334] "Generic (PLEG): container finished" podID="e6ce602c-aad4-4d9d-a924-a200b8d8658d" containerID="9de148a07dcb85b7edd57f5ca6a713f4e0997490566e3dab9624a50d096a551f" exitCode=0 Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.019178 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8dbcn" event={"ID":"e6ce602c-aad4-4d9d-a924-a200b8d8658d","Type":"ContainerDied","Data":"9de148a07dcb85b7edd57f5ca6a713f4e0997490566e3dab9624a50d096a551f"} Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.509925 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.551422 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-config\") pod \"0d1d957c-03f0-472f-888f-f410cb214bba\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.551547 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-dns-svc\") pod \"0d1d957c-03f0-472f-888f-f410cb214bba\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.551614 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-nb\") pod \"0d1d957c-03f0-472f-888f-f410cb214bba\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.551704 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-sb\") pod \"0d1d957c-03f0-472f-888f-f410cb214bba\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.551784 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ww448\" (UniqueName: \"kubernetes.io/projected/0d1d957c-03f0-472f-888f-f410cb214bba-kube-api-access-ww448\") pod \"0d1d957c-03f0-472f-888f-f410cb214bba\" (UID: \"0d1d957c-03f0-472f-888f-f410cb214bba\") " Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.558223 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0d1d957c-03f0-472f-888f-f410cb214bba-kube-api-access-ww448" (OuterVolumeSpecName: "kube-api-access-ww448") pod "0d1d957c-03f0-472f-888f-f410cb214bba" (UID: "0d1d957c-03f0-472f-888f-f410cb214bba"). InnerVolumeSpecName "kube-api-access-ww448". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.609717 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-config" (OuterVolumeSpecName: "config") pod "0d1d957c-03f0-472f-888f-f410cb214bba" (UID: "0d1d957c-03f0-472f-888f-f410cb214bba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.626825 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0d1d957c-03f0-472f-888f-f410cb214bba" (UID: "0d1d957c-03f0-472f-888f-f410cb214bba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.629370 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0d1d957c-03f0-472f-888f-f410cb214bba" (UID: "0d1d957c-03f0-472f-888f-f410cb214bba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.632779 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0d1d957c-03f0-472f-888f-f410cb214bba" (UID: "0d1d957c-03f0-472f-888f-f410cb214bba"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.654027 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ww448\" (UniqueName: \"kubernetes.io/projected/0d1d957c-03f0-472f-888f-f410cb214bba-kube-api-access-ww448\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.654085 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.654097 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.654109 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:17 crc kubenswrapper[4841]: I1203 17:19:17.654138 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0d1d957c-03f0-472f-888f-f410cb214bba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:17 crc kubenswrapper[4841]: E1203 17:19:17.873744 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Dec 03 17:19:17 crc kubenswrapper[4841]: E1203 17:19:17.873985 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n588h694h69hf7h57dh557h679h557hf7h579h6h65h9bh555h74h594h586h6dhf7hdbh8fh5dch56bh676h54chb4h64h5b8hbdh55fh579h5bdq,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-52l7f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(bd5216b6-064c-45a5-868c-816d362eced0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:19:18 crc kubenswrapper[4841]: I1203 17:19:18.028732 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-znrqh" event={"ID":"0d1d957c-03f0-472f-888f-f410cb214bba","Type":"ContainerDied","Data":"6fb7dc9843eeafde5febd4d60b3c72b2de2bbad7ef5cd675bae90ca03582c364"} Dec 03 17:19:18 crc kubenswrapper[4841]: I1203 17:19:18.028765 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-znrqh" Dec 03 17:19:18 crc kubenswrapper[4841]: I1203 17:19:18.079842 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-znrqh"] Dec 03 17:19:18 crc kubenswrapper[4841]: I1203 17:19:18.088264 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-znrqh"] Dec 03 17:19:18 crc kubenswrapper[4841]: I1203 17:19:18.251072 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" path="/var/lib/kubelet/pods/0d1d957c-03f0-472f-888f-f410cb214bba/volumes" Dec 03 17:19:18 crc kubenswrapper[4841]: E1203 17:19:18.930937 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 03 17:19:18 crc kubenswrapper[4841]: E1203 17:19:18.931111 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sg7k6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-82zhd_openstack(c094953c-fc36-4dda-9497-381f9ae48471): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 17:19:18 crc kubenswrapper[4841]: E1203 17:19:18.932367 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-82zhd" podUID="c094953c-fc36-4dda-9497-381f9ae48471" Dec 03 17:19:18 crc kubenswrapper[4841]: I1203 17:19:18.937311 4841 scope.go:117] "RemoveContainer" containerID="1f98a46f9130f48503513e3982adb5d8453cda50deb785e715a2b280bd6b9e82" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.081171 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-8dbcn" event={"ID":"e6ce602c-aad4-4d9d-a924-a200b8d8658d","Type":"ContainerDied","Data":"01b11265860ef7d559eb6cc83dcf1c77fca03bfbfca440a6a20cd64b42c50193"} Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.081481 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01b11265860ef7d559eb6cc83dcf1c77fca03bfbfca440a6a20cd64b42c50193" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.097798 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:19:19 crc kubenswrapper[4841]: E1203 17:19:19.143601 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-82zhd" podUID="c094953c-fc36-4dda-9497-381f9ae48471" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.143949 4841 scope.go:117] "RemoveContainer" containerID="e11a8dbc5f1be6ba2686e11670aa11966266c6314e9773ecdf136dcdd2c84ad9" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.181047 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-combined-ca-bundle\") pod \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.181101 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8c7j5\" (UniqueName: \"kubernetes.io/projected/e6ce602c-aad4-4d9d-a924-a200b8d8658d-kube-api-access-8c7j5\") pod \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.181253 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-config\") pod \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\" (UID: \"e6ce602c-aad4-4d9d-a924-a200b8d8658d\") " Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.187217 4841 scope.go:117] "RemoveContainer" containerID="3430a262df7488cf7cadd3b02087a329c1e9e795e400ef525d1b62fe323132c5" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.197577 4841 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/e6ce602c-aad4-4d9d-a924-a200b8d8658d-kube-api-access-8c7j5" (OuterVolumeSpecName: "kube-api-access-8c7j5") pod "e6ce602c-aad4-4d9d-a924-a200b8d8658d" (UID: "e6ce602c-aad4-4d9d-a924-a200b8d8658d"). InnerVolumeSpecName "kube-api-access-8c7j5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.214753 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-config" (OuterVolumeSpecName: "config") pod "e6ce602c-aad4-4d9d-a924-a200b8d8658d" (UID: "e6ce602c-aad4-4d9d-a924-a200b8d8658d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.218047 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e6ce602c-aad4-4d9d-a924-a200b8d8658d" (UID: "e6ce602c-aad4-4d9d-a924-a200b8d8658d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.258930 4841 scope.go:117] "RemoveContainer" containerID="f7c37848c1193c51c133fd068f98e947890b68d25d11f0a02566abbca90f818a" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.283197 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.283221 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6ce602c-aad4-4d9d-a924-a200b8d8658d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.283232 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8c7j5\" (UniqueName: \"kubernetes.io/projected/e6ce602c-aad4-4d9d-a924-a200b8d8658d-kube-api-access-8c7j5\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.308318 4841 scope.go:117] "RemoveContainer" containerID="dd07afb0b783f3ea965266c320416a3dab1c19712d9646d949f37f48af741f70" Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.500968 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.592106 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-8lggq"] Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.608641 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:19:19 crc kubenswrapper[4841]: W1203 17:19:19.641860 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod51ec7e80_3ef6_478e_b388_5835beb1c733.slice/crio-a1e807e1e69a3892c513ca6d36478667318c9aefcc7f05faad42e0fa6d3b9759 WatchSource:0}: Error finding container a1e807e1e69a3892c513ca6d36478667318c9aefcc7f05faad42e0fa6d3b9759: Status 404 returned error can't find the container with id a1e807e1e69a3892c513ca6d36478667318c9aefcc7f05faad42e0fa6d3b9759 Dec 03 17:19:19 crc kubenswrapper[4841]: W1203 17:19:19.645087 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc366f0b0_a1f7_4452_93c9_d408d117d651.slice/crio-0751a9a7c6f71c691220b42d3173d03868f08ecaa04cc7594d8db3b7b7357028 WatchSource:0}: Error finding container 0751a9a7c6f71c691220b42d3173d03868f08ecaa04cc7594d8db3b7b7357028: Status 404 returned error can't find the container with id 0751a9a7c6f71c691220b42d3173d03868f08ecaa04cc7594d8db3b7b7357028 Dec 03 17:19:19 crc kubenswrapper[4841]: I1203 17:19:19.859885 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-698758b865-znrqh" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.112:5353: i/o timeout" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.100384 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"51ec7e80-3ef6-478e-b388-5835beb1c733","Type":"ContainerStarted","Data":"a1e807e1e69a3892c513ca6d36478667318c9aefcc7f05faad42e0fa6d3b9759"} Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.101651 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lggq" event={"ID":"c366f0b0-a1f7-4452-93c9-d408d117d651","Type":"ContainerStarted","Data":"4c9447f0ac6d6d48eaf6dae4a6ff308647d900b06d0b0415878fd555669c1cbd"} Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.101687 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/keystone-bootstrap-8lggq" event={"ID":"c366f0b0-a1f7-4452-93c9-d408d117d651","Type":"ContainerStarted","Data":"0751a9a7c6f71c691220b42d3173d03868f08ecaa04cc7594d8db3b7b7357028"} Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.102801 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gdwlx" event={"ID":"b2fec744-9e89-4330-88a5-f0e4c2173870","Type":"ContainerStarted","Data":"7ffa8405ccdd37a45e196fc903cd0c7bd7ab8e307ffe8880f5de6bb5ede6c861"} Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.105422 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"749c6395-a67d-40f3-9347-206393f91228","Type":"ContainerStarted","Data":"fe89a96ac4cacdfeec1e4a4f568c21786cd28ebe31e752d237149751ff912400"} Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.106975 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd5216b6-064c-45a5-868c-816d362eced0","Type":"ContainerStarted","Data":"000e6d5dbe0b6a01884099e5baec768c622b87c904b98a1bb2f122c1e18013ca"} Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.107003 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-8dbcn" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.129386 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-8lggq" podStartSLOduration=10.129366274 podStartE2EDuration="10.129366274s" podCreationTimestamp="2025-12-03 17:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:20.118202474 +0000 UTC m=+1154.505723201" watchObservedRunningTime="2025-12-03 17:19:20.129366274 +0000 UTC m=+1154.516886991" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.149405 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-gdwlx" podStartSLOduration=4.32985178 podStartE2EDuration="27.149385852s" podCreationTimestamp="2025-12-03 17:18:53 +0000 UTC" firstStartedPulling="2025-12-03 17:18:54.581447286 +0000 UTC m=+1128.968968013" lastFinishedPulling="2025-12-03 17:19:17.400981358 +0000 UTC m=+1151.788502085" observedRunningTime="2025-12-03 17:19:20.144232751 +0000 UTC m=+1154.531753478" watchObservedRunningTime="2025-12-03 17:19:20.149385852 +0000 UTC m=+1154.536906569" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.280377 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-5jctc"] Dec 03 17:19:20 crc kubenswrapper[4841]: E1203 17:19:20.281285 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ce602c-aad4-4d9d-a924-a200b8d8658d" containerName="neutron-db-sync" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.281305 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ce602c-aad4-4d9d-a924-a200b8d8658d" containerName="neutron-db-sync" Dec 03 17:19:20 crc kubenswrapper[4841]: E1203 17:19:20.281334 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" 
containerName="init" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.281343 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" containerName="init" Dec 03 17:19:20 crc kubenswrapper[4841]: E1203 17:19:20.281360 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" containerName="dnsmasq-dns" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.281367 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" containerName="dnsmasq-dns" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.281603 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ce602c-aad4-4d9d-a924-a200b8d8658d" containerName="neutron-db-sync" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.281622 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d1d957c-03f0-472f-888f-f410cb214bba" containerName="dnsmasq-dns" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.282704 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.303536 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-5jctc"] Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.416492 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.416583 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-config\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.416662 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.416721 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-svc\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.416748 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-s4jh2\" (UniqueName: \"kubernetes.io/projected/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-kube-api-access-s4jh2\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.416780 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.457484 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-774586549b-jbmpq"] Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.461838 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.465985 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.466218 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.466466 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-2pvqc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.466581 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.469091 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-774586549b-jbmpq"] Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519241 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-combined-ca-bundle\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519268 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-svc\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519291 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4jh2\" (UniqueName: \"kubernetes.io/projected/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-kube-api-access-s4jh2\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519312 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-ovndb-tls-certs\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519363 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hrs2g\" (UniqueName: \"kubernetes.io/projected/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-kube-api-access-hrs2g\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519381 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519429 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-config\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519462 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519733 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-config\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.519809 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-httpd-config\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.520962 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.521652 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-svc\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.524203 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.533154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.533692 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-config\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: 
\"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.539023 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4jh2\" (UniqueName: \"kubernetes.io/projected/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-kube-api-access-s4jh2\") pod \"dnsmasq-dns-55f844cf75-5jctc\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.621301 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-httpd-config\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.621421 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-combined-ca-bundle\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.621481 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-ovndb-tls-certs\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.621516 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrs2g\" (UniqueName: \"kubernetes.io/projected/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-kube-api-access-hrs2g\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " 
pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.621575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-config\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.627682 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-combined-ca-bundle\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.627687 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-httpd-config\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.629071 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-config\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.632304 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-ovndb-tls-certs\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.641587 4841 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-hrs2g\" (UniqueName: \"kubernetes.io/projected/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-kube-api-access-hrs2g\") pod \"neutron-774586549b-jbmpq\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.804484 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:20 crc kubenswrapper[4841]: I1203 17:19:20.815432 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:21.156780 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"749c6395-a67d-40f3-9347-206393f91228","Type":"ContainerStarted","Data":"100f840d1f2804bb16eec10829512b68717d3687c059282364e1be370829b397"} Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:21.168591 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"51ec7e80-3ef6-478e-b388-5835beb1c733","Type":"ContainerStarted","Data":"aaf61c90e47e2bab624d684c2d66061e20f328133d254a00df33a94a9fcbbb8c"} Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:22.183100 4841 generic.go:334] "Generic (PLEG): container finished" podID="b2fec744-9e89-4330-88a5-f0e4c2173870" containerID="7ffa8405ccdd37a45e196fc903cd0c7bd7ab8e307ffe8880f5de6bb5ede6c861" exitCode=0 Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:22.183181 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gdwlx" event={"ID":"b2fec744-9e89-4330-88a5-f0e4c2173870","Type":"ContainerDied","Data":"7ffa8405ccdd37a45e196fc903cd0c7bd7ab8e307ffe8880f5de6bb5ede6c861"} Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:22.187781 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"749c6395-a67d-40f3-9347-206393f91228","Type":"ContainerStarted","Data":"b8032a3c094bb56d3d5f5ce52a22a9c276d899fcdea69130fe26a77e81be7ca7"} Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:22.190758 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"51ec7e80-3ef6-478e-b388-5835beb1c733","Type":"ContainerStarted","Data":"e2558236d0746a73d50a3a05b611cd0dba02053bc82e2eeab22e659894be5e5b"} Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:22.229320 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=14.22930588 podStartE2EDuration="14.22930588s" podCreationTimestamp="2025-12-03 17:19:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:22.228626794 +0000 UTC m=+1156.616147521" watchObservedRunningTime="2025-12-03 17:19:22.22930588 +0000 UTC m=+1156.616826607" Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:22.255308 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=12.255292177 podStartE2EDuration="12.255292177s" podCreationTimestamp="2025-12-03 17:19:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:22.249752027 +0000 UTC m=+1156.637272754" watchObservedRunningTime="2025-12-03 17:19:22.255292177 +0000 UTC m=+1156.642812904" Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:22.358550 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-5jctc"] Dec 03 17:19:22 crc kubenswrapper[4841]: W1203 17:19:22.368341 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9bcc5c9_d525_4d1c_b74c_c6301b17967f.slice/crio-8812f7acf0803152e32800b26c4b69ec00b72790d7bf44abe6a191492096e180 WatchSource:0}: Error finding container 8812f7acf0803152e32800b26c4b69ec00b72790d7bf44abe6a191492096e180: Status 404 returned error can't find the container with id 8812f7acf0803152e32800b26c4b69ec00b72790d7bf44abe6a191492096e180 Dec 03 17:19:22 crc kubenswrapper[4841]: I1203 17:19:22.527933 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-774586549b-jbmpq"] Dec 03 17:19:22 crc kubenswrapper[4841]: W1203 17:19:22.541759 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7751cb9c_f0c8_4db4_b53f_5ac0ae8f16a9.slice/crio-4274cdcce8c8c6ccb7641f9847dc315d12748a223a2bee7ae85f2442060208f5 WatchSource:0}: Error finding container 4274cdcce8c8c6ccb7641f9847dc315d12748a223a2bee7ae85f2442060208f5: Status 404 returned error can't find the container with id 4274cdcce8c8c6ccb7641f9847dc315d12748a223a2bee7ae85f2442060208f5 Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.057202 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6fbf7d7cfc-n8r2b"] Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.059098 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.060745 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.060950 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.072408 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fbf7d7cfc-n8r2b"] Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.176227 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-public-tls-certs\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.176271 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-combined-ca-bundle\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.176335 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-config\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.176363 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-ovndb-tls-certs\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.176403 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lqkk\" (UniqueName: \"kubernetes.io/projected/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-kube-api-access-6lqkk\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.176436 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-internal-tls-certs\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.176466 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-httpd-config\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.204247 4841 generic.go:334] "Generic (PLEG): container finished" podID="f9bcc5c9-d525-4d1c-b74c-c6301b17967f" containerID="6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74" exitCode=0 Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.204329 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" event={"ID":"f9bcc5c9-d525-4d1c-b74c-c6301b17967f","Type":"ContainerDied","Data":"6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74"} Dec 03 17:19:23 
crc kubenswrapper[4841]: I1203 17:19:23.204361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" event={"ID":"f9bcc5c9-d525-4d1c-b74c-c6301b17967f","Type":"ContainerStarted","Data":"8812f7acf0803152e32800b26c4b69ec00b72790d7bf44abe6a191492096e180"} Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.213404 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774586549b-jbmpq" event={"ID":"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9","Type":"ContainerStarted","Data":"4274cdcce8c8c6ccb7641f9847dc315d12748a223a2bee7ae85f2442060208f5"} Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.278341 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-combined-ca-bundle\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.278675 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-config\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.278722 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-ovndb-tls-certs\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.278824 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lqkk\" (UniqueName: 
\"kubernetes.io/projected/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-kube-api-access-6lqkk\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.278893 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-internal-tls-certs\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.278947 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-httpd-config\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.279073 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-public-tls-certs\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.284997 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-ovndb-tls-certs\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.285895 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-public-tls-certs\") pod 
\"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.286154 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-internal-tls-certs\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.290505 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-httpd-config\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.290898 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-combined-ca-bundle\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.290963 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-config\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 crc kubenswrapper[4841]: I1203 17:19:23.299024 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lqkk\" (UniqueName: \"kubernetes.io/projected/7e14311f-cc1f-454b-af0b-94a5cf3ed4e3-kube-api-access-6lqkk\") pod \"neutron-6fbf7d7cfc-n8r2b\" (UID: \"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3\") " pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:23 
crc kubenswrapper[4841]: I1203 17:19:23.381094 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:24 crc kubenswrapper[4841]: I1203 17:19:24.237810 4841 generic.go:334] "Generic (PLEG): container finished" podID="c366f0b0-a1f7-4452-93c9-d408d117d651" containerID="4c9447f0ac6d6d48eaf6dae4a6ff308647d900b06d0b0415878fd555669c1cbd" exitCode=0 Dec 03 17:19:24 crc kubenswrapper[4841]: I1203 17:19:24.238013 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lggq" event={"ID":"c366f0b0-a1f7-4452-93c9-d408d117d651","Type":"ContainerDied","Data":"4c9447f0ac6d6d48eaf6dae4a6ff308647d900b06d0b0415878fd555669c1cbd"} Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.587314 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.605807 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-config-data\") pod \"c366f0b0-a1f7-4452-93c9-d408d117d651\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.605939 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-credential-keys\") pod \"c366f0b0-a1f7-4452-93c9-d408d117d651\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.606647 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-926l6\" (UniqueName: \"kubernetes.io/projected/c366f0b0-a1f7-4452-93c9-d408d117d651-kube-api-access-926l6\") pod \"c366f0b0-a1f7-4452-93c9-d408d117d651\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " Dec 03 17:19:28 crc 
kubenswrapper[4841]: I1203 17:19:28.606698 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-fernet-keys\") pod \"c366f0b0-a1f7-4452-93c9-d408d117d651\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.606769 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-scripts\") pod \"c366f0b0-a1f7-4452-93c9-d408d117d651\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.606856 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-combined-ca-bundle\") pod \"c366f0b0-a1f7-4452-93c9-d408d117d651\" (UID: \"c366f0b0-a1f7-4452-93c9-d408d117d651\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.616844 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gdwlx" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.641191 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "c366f0b0-a1f7-4452-93c9-d408d117d651" (UID: "c366f0b0-a1f7-4452-93c9-d408d117d651"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.650298 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "c366f0b0-a1f7-4452-93c9-d408d117d651" (UID: "c366f0b0-a1f7-4452-93c9-d408d117d651"). 
InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.652147 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c366f0b0-a1f7-4452-93c9-d408d117d651-kube-api-access-926l6" (OuterVolumeSpecName: "kube-api-access-926l6") pod "c366f0b0-a1f7-4452-93c9-d408d117d651" (UID: "c366f0b0-a1f7-4452-93c9-d408d117d651"). InnerVolumeSpecName "kube-api-access-926l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.656269 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-scripts" (OuterVolumeSpecName: "scripts") pod "c366f0b0-a1f7-4452-93c9-d408d117d651" (UID: "c366f0b0-a1f7-4452-93c9-d408d117d651"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.687970 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c366f0b0-a1f7-4452-93c9-d408d117d651" (UID: "c366f0b0-a1f7-4452-93c9-d408d117d651"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.708221 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2fec744-9e89-4330-88a5-f0e4c2173870-logs\") pod \"b2fec744-9e89-4330-88a5-f0e4c2173870\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.708600 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fec744-9e89-4330-88a5-f0e4c2173870-logs" (OuterVolumeSpecName: "logs") pod "b2fec744-9e89-4330-88a5-f0e4c2173870" (UID: "b2fec744-9e89-4330-88a5-f0e4c2173870"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.708728 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q78sd\" (UniqueName: \"kubernetes.io/projected/b2fec744-9e89-4330-88a5-f0e4c2173870-kube-api-access-q78sd\") pod \"b2fec744-9e89-4330-88a5-f0e4c2173870\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.709118 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-combined-ca-bundle\") pod \"b2fec744-9e89-4330-88a5-f0e4c2173870\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.709143 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-config-data\") pod \"b2fec744-9e89-4330-88a5-f0e4c2173870\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.709198 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-scripts\") pod \"b2fec744-9e89-4330-88a5-f0e4c2173870\" (UID: \"b2fec744-9e89-4330-88a5-f0e4c2173870\") " Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.709832 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.709851 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.709862 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2fec744-9e89-4330-88a5-f0e4c2173870-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.709871 4841 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.709880 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-926l6\" (UniqueName: \"kubernetes.io/projected/c366f0b0-a1f7-4452-93c9-d408d117d651-kube-api-access-926l6\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.709888 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.714105 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-scripts" 
(OuterVolumeSpecName: "scripts") pod "b2fec744-9e89-4330-88a5-f0e4c2173870" (UID: "b2fec744-9e89-4330-88a5-f0e4c2173870"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.714186 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2fec744-9e89-4330-88a5-f0e4c2173870-kube-api-access-q78sd" (OuterVolumeSpecName: "kube-api-access-q78sd") pod "b2fec744-9e89-4330-88a5-f0e4c2173870" (UID: "b2fec744-9e89-4330-88a5-f0e4c2173870"). InnerVolumeSpecName "kube-api-access-q78sd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.727538 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-config-data" (OuterVolumeSpecName: "config-data") pod "c366f0b0-a1f7-4452-93c9-d408d117d651" (UID: "c366f0b0-a1f7-4452-93c9-d408d117d651"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.755795 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-config-data" (OuterVolumeSpecName: "config-data") pod "b2fec744-9e89-4330-88a5-f0e4c2173870" (UID: "b2fec744-9e89-4330-88a5-f0e4c2173870"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.761059 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2fec744-9e89-4330-88a5-f0e4c2173870" (UID: "b2fec744-9e89-4330-88a5-f0e4c2173870"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.811459 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q78sd\" (UniqueName: \"kubernetes.io/projected/b2fec744-9e89-4330-88a5-f0e4c2173870-kube-api-access-q78sd\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.811488 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c366f0b0-a1f7-4452-93c9-d408d117d651-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.811498 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.811507 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.811516 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2fec744-9e89-4330-88a5-f0e4c2173870-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:28 crc kubenswrapper[4841]: I1203 17:19:28.943961 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6fbf7d7cfc-n8r2b"] Dec 03 17:19:28 crc kubenswrapper[4841]: W1203 17:19:28.947934 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e14311f_cc1f_454b_af0b_94a5cf3ed4e3.slice/crio-66fb3d726caf68419453145cae39b87fe6752592e6378387ec35198a5fbcf7f0 WatchSource:0}: Error finding container 66fb3d726caf68419453145cae39b87fe6752592e6378387ec35198a5fbcf7f0: Status 404 returned error can't find the 
container with id 66fb3d726caf68419453145cae39b87fe6752592e6378387ec35198a5fbcf7f0 Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.345965 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-gdwlx" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.345965 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-gdwlx" event={"ID":"b2fec744-9e89-4330-88a5-f0e4c2173870","Type":"ContainerDied","Data":"717c5c5a087ed366c07199e2d501b1349f73ee9001a2c7a310bb59bb6bb3373f"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.346419 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="717c5c5a087ed366c07199e2d501b1349f73ee9001a2c7a310bb59bb6bb3373f" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.347472 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" event={"ID":"f9bcc5c9-d525-4d1c-b74c-c6301b17967f","Type":"ContainerStarted","Data":"646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.347584 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.349513 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.349538 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.359938 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd5216b6-064c-45a5-868c-816d362eced0","Type":"ContainerStarted","Data":"bdde33202f03aa340bb118003bd07d691d2d2f5eba6e187eaace5f92277ee8a6"} Dec 03 17:19:29 crc kubenswrapper[4841]: 
I1203 17:19:29.366200 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fbf7d7cfc-n8r2b" event={"ID":"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3","Type":"ContainerStarted","Data":"d304d8488816c217f10c0b1d94c5a57da1cce937482bdad542d64038a68b203c"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.366235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fbf7d7cfc-n8r2b" event={"ID":"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3","Type":"ContainerStarted","Data":"918ef61b270cad15f85ae5d2168d73c652d422312c8037a26cfd7e66d739ff53"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.366245 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6fbf7d7cfc-n8r2b" event={"ID":"7e14311f-cc1f-454b-af0b-94a5cf3ed4e3","Type":"ContainerStarted","Data":"66fb3d726caf68419453145cae39b87fe6752592e6378387ec35198a5fbcf7f0"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.367131 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.378252 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" podStartSLOduration=9.378231258 podStartE2EDuration="9.378231258s" podCreationTimestamp="2025-12-03 17:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:29.370027726 +0000 UTC m=+1163.757548453" watchObservedRunningTime="2025-12-03 17:19:29.378231258 +0000 UTC m=+1163.765751985" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.389392 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774586549b-jbmpq" event={"ID":"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9","Type":"ContainerStarted","Data":"6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 
17:19:29.389428 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774586549b-jbmpq" event={"ID":"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9","Type":"ContainerStarted","Data":"4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.389440 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.391557 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gj6sq" event={"ID":"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa","Type":"ContainerStarted","Data":"f7b88de14b208885be22036f254026a8e37099177dcb084d9369893a26997d5e"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.396800 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.402026 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-8lggq" event={"ID":"c366f0b0-a1f7-4452-93c9-d408d117d651","Type":"ContainerDied","Data":"0751a9a7c6f71c691220b42d3173d03868f08ecaa04cc7594d8db3b7b7357028"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.402078 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0751a9a7c6f71c691220b42d3173d03868f08ecaa04cc7594d8db3b7b7357028" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.402151 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-8lggq" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.413222 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.416144 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6fbf7d7cfc-n8r2b" podStartSLOduration=6.416129693 podStartE2EDuration="6.416129693s" podCreationTimestamp="2025-12-03 17:19:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:29.398177393 +0000 UTC m=+1163.785698120" watchObservedRunningTime="2025-12-03 17:19:29.416129693 +0000 UTC m=+1163.803650420" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.416367 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-drrfd" event={"ID":"6a7b606d-af6a-477c-9ac6-f93db645651d","Type":"ContainerStarted","Data":"f06c4fa375358c54eef0791779277a440dec87b0219d207917bc57f5f83f2919"} Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.434856 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-gj6sq" podStartSLOduration=2.248299484 podStartE2EDuration="36.434835699s" podCreationTimestamp="2025-12-03 17:18:53 +0000 UTC" firstStartedPulling="2025-12-03 17:18:54.47495407 +0000 UTC m=+1128.862474797" lastFinishedPulling="2025-12-03 17:19:28.661490285 +0000 UTC m=+1163.049011012" observedRunningTime="2025-12-03 17:19:29.422076451 +0000 UTC m=+1163.809597188" watchObservedRunningTime="2025-12-03 17:19:29.434835699 +0000 UTC m=+1163.822356426" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.443892 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-774586549b-jbmpq" podStartSLOduration=9.44387633 podStartE2EDuration="9.44387633s" 
podCreationTimestamp="2025-12-03 17:19:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:29.440081692 +0000 UTC m=+1163.827602419" watchObservedRunningTime="2025-12-03 17:19:29.44387633 +0000 UTC m=+1163.831397057" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.526573 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-drrfd" podStartSLOduration=2.953377219 podStartE2EDuration="37.52655333s" podCreationTimestamp="2025-12-03 17:18:52 +0000 UTC" firstStartedPulling="2025-12-03 17:18:54.043428956 +0000 UTC m=+1128.430949683" lastFinishedPulling="2025-12-03 17:19:28.616605067 +0000 UTC m=+1163.004125794" observedRunningTime="2025-12-03 17:19:29.523561661 +0000 UTC m=+1163.911082388" watchObservedRunningTime="2025-12-03 17:19:29.52655333 +0000 UTC m=+1163.914074067" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.729049 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-59cdc4489f-kzmfh"] Dec 03 17:19:29 crc kubenswrapper[4841]: E1203 17:19:29.730285 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c366f0b0-a1f7-4452-93c9-d408d117d651" containerName="keystone-bootstrap" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.730355 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c366f0b0-a1f7-4452-93c9-d408d117d651" containerName="keystone-bootstrap" Dec 03 17:19:29 crc kubenswrapper[4841]: E1203 17:19:29.730417 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2fec744-9e89-4330-88a5-f0e4c2173870" containerName="placement-db-sync" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.730472 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fec744-9e89-4330-88a5-f0e4c2173870" containerName="placement-db-sync" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.730696 4841 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="b2fec744-9e89-4330-88a5-f0e4c2173870" containerName="placement-db-sync" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.730754 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c366f0b0-a1f7-4452-93c9-d408d117d651" containerName="keystone-bootstrap" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.731382 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.734087 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.734453 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.734604 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.734794 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.734963 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-4qfbt" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.735109 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.755087 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-59cdc4489f-kzmfh"] Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.849213 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-scripts\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") 
" pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.849641 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-fernet-keys\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.849739 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-config-data\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.849775 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8qzb\" (UniqueName: \"kubernetes.io/projected/471f221b-da02-49b2-901b-c8afd7aa38c5-kube-api-access-f8qzb\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.849816 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-public-tls-certs\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.849839 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-internal-tls-certs\") pod \"keystone-59cdc4489f-kzmfh\" (UID: 
\"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.849860 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-credential-keys\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.849880 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-combined-ca-bundle\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.852766 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-59984668d4-h88x4"] Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.855638 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.857589 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-x78cp" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.858252 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.860773 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.861096 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.865358 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.865622 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59984668d4-h88x4"] Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.951581 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-internal-tls-certs\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.951644 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/753fad74-21af-48dd-ae45-1162eb580f22-logs\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.951804 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w88b8\" (UniqueName: \"kubernetes.io/projected/753fad74-21af-48dd-ae45-1162eb580f22-kube-api-access-w88b8\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.951852 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-config-data\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.951891 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8qzb\" (UniqueName: \"kubernetes.io/projected/471f221b-da02-49b2-901b-c8afd7aa38c5-kube-api-access-f8qzb\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.951949 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-config-data\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.951972 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-scripts\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.952004 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-public-tls-certs\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.952021 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-internal-tls-certs\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.952042 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-credential-keys\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.952059 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-combined-ca-bundle\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.952135 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-scripts\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.952182 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-public-tls-certs\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.952526 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-fernet-keys\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.952581 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-combined-ca-bundle\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.959016 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-credential-keys\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.959478 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-scripts\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.959633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-internal-tls-certs\") pod 
\"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.959966 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-combined-ca-bundle\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.961474 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-config-data\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.961522 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-public-tls-certs\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.963474 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/471f221b-da02-49b2-901b-c8afd7aa38c5-fernet-keys\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:29 crc kubenswrapper[4841]: I1203 17:19:29.973125 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8qzb\" (UniqueName: \"kubernetes.io/projected/471f221b-da02-49b2-901b-c8afd7aa38c5-kube-api-access-f8qzb\") pod \"keystone-59cdc4489f-kzmfh\" (UID: \"471f221b-da02-49b2-901b-c8afd7aa38c5\") " pod="openstack/keystone-59cdc4489f-kzmfh" 
Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.054057 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.054618 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-internal-tls-certs\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.054690 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/753fad74-21af-48dd-ae45-1162eb580f22-logs\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.054717 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w88b8\" (UniqueName: \"kubernetes.io/projected/753fad74-21af-48dd-ae45-1162eb580f22-kube-api-access-w88b8\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.054760 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-config-data\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.054779 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-scripts\") pod \"placement-59984668d4-h88x4\" (UID: 
\"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.054836 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-public-tls-certs\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.054870 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-combined-ca-bundle\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.055262 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/753fad74-21af-48dd-ae45-1162eb580f22-logs\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.058827 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-internal-tls-certs\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.059573 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-config-data\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc 
kubenswrapper[4841]: I1203 17:19:30.060563 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-combined-ca-bundle\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.061118 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-public-tls-certs\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.065197 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/753fad74-21af-48dd-ae45-1162eb580f22-scripts\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.080643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w88b8\" (UniqueName: \"kubernetes.io/projected/753fad74-21af-48dd-ae45-1162eb580f22-kube-api-access-w88b8\") pod \"placement-59984668d4-h88x4\" (UID: \"753fad74-21af-48dd-ae45-1162eb580f22\") " pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.174072 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.373960 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.374021 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.408925 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.434919 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.459600 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.459659 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.459672 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 17:19:30 crc kubenswrapper[4841]: I1203 17:19:30.459683 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 17:19:31 crc kubenswrapper[4841]: I1203 17:19:31.401757 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-59cdc4489f-kzmfh"] Dec 03 17:19:31 crc kubenswrapper[4841]: W1203 17:19:31.406955 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod471f221b_da02_49b2_901b_c8afd7aa38c5.slice/crio-bc589ce6b8d4be3543fc449f8271a31111dec7e24436e8c3132af02b4868c600 
WatchSource:0}: Error finding container bc589ce6b8d4be3543fc449f8271a31111dec7e24436e8c3132af02b4868c600: Status 404 returned error can't find the container with id bc589ce6b8d4be3543fc449f8271a31111dec7e24436e8c3132af02b4868c600 Dec 03 17:19:31 crc kubenswrapper[4841]: I1203 17:19:31.420263 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-59984668d4-h88x4"] Dec 03 17:19:31 crc kubenswrapper[4841]: I1203 17:19:31.463827 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59984668d4-h88x4" event={"ID":"753fad74-21af-48dd-ae45-1162eb580f22","Type":"ContainerStarted","Data":"a5c8908d0ebb5e5d1cceb24a0361de877a5fa46a610cecac1356685ef720db03"} Dec 03 17:19:31 crc kubenswrapper[4841]: I1203 17:19:31.465838 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-59cdc4489f-kzmfh" event={"ID":"471f221b-da02-49b2-901b-c8afd7aa38c5","Type":"ContainerStarted","Data":"bc589ce6b8d4be3543fc449f8271a31111dec7e24436e8c3132af02b4868c600"} Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.476112 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-59cdc4489f-kzmfh" event={"ID":"471f221b-da02-49b2-901b-c8afd7aa38c5","Type":"ContainerStarted","Data":"99e52b41d3944c47644282d6fcd7a0bce4e5c06e9caff6b01577a6e1507112a3"} Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.476856 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.478788 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59984668d4-h88x4" event={"ID":"753fad74-21af-48dd-ae45-1162eb580f22","Type":"ContainerStarted","Data":"16ead74805e1c627e18799e4ce17379d235c5b061342fad04f9e6092c20a358f"} Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.478824 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 17:19:32 crc kubenswrapper[4841]: 
I1203 17:19:32.478842 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.478842 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-59984668d4-h88x4" event={"ID":"753fad74-21af-48dd-ae45-1162eb580f22","Type":"ContainerStarted","Data":"aa1528fff20d5ba79689b0c450ba79f6d06e133d0b8568b994919c1f455818b9"} Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.478942 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.478964 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.479080 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.481920 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.487048 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.496628 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-59cdc4489f-kzmfh" podStartSLOduration=3.496613189 podStartE2EDuration="3.496613189s" podCreationTimestamp="2025-12-03 17:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:32.494221443 +0000 UTC m=+1166.881742170" watchObservedRunningTime="2025-12-03 17:19:32.496613189 +0000 UTC m=+1166.884133916" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.523447 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-59984668d4-h88x4" 
podStartSLOduration=3.523422295 podStartE2EDuration="3.523422295s" podCreationTimestamp="2025-12-03 17:19:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:32.517417935 +0000 UTC m=+1166.904938662" watchObservedRunningTime="2025-12-03 17:19:32.523422295 +0000 UTC m=+1166.910943032" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.666743 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:32 crc kubenswrapper[4841]: I1203 17:19:32.735627 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 17:19:33 crc kubenswrapper[4841]: I1203 17:19:33.487641 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-59984668d4-h88x4" Dec 03 17:19:34 crc kubenswrapper[4841]: E1203 17:19:34.608442 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod084bb3a0_2558_43d3_a0f5_7ac7cd02c3fa.slice/crio-f7b88de14b208885be22036f254026a8e37099177dcb084d9369893a26997d5e.scope\": RecentStats: unable to find data in memory cache]" Dec 03 17:19:35 crc kubenswrapper[4841]: I1203 17:19:35.519957 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-82zhd" event={"ID":"c094953c-fc36-4dda-9497-381f9ae48471","Type":"ContainerStarted","Data":"49dbc947db12cfbe84efcd55391a15aa3b2a13e182496b08f555ed86684a8885"} Dec 03 17:19:35 crc kubenswrapper[4841]: I1203 17:19:35.521805 4841 generic.go:334] "Generic (PLEG): container finished" podID="084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" containerID="f7b88de14b208885be22036f254026a8e37099177dcb084d9369893a26997d5e" exitCode=0 Dec 03 17:19:35 crc kubenswrapper[4841]: I1203 17:19:35.521830 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gj6sq" event={"ID":"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa","Type":"ContainerDied","Data":"f7b88de14b208885be22036f254026a8e37099177dcb084d9369893a26997d5e"} Dec 03 17:19:35 crc kubenswrapper[4841]: I1203 17:19:35.544417 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-82zhd" podStartSLOduration=3.2720925960000002 podStartE2EDuration="42.544401072s" podCreationTimestamp="2025-12-03 17:18:53 +0000 UTC" firstStartedPulling="2025-12-03 17:18:54.476786073 +0000 UTC m=+1128.864306800" lastFinishedPulling="2025-12-03 17:19:33.749094549 +0000 UTC m=+1168.136615276" observedRunningTime="2025-12-03 17:19:35.537948361 +0000 UTC m=+1169.925469088" watchObservedRunningTime="2025-12-03 17:19:35.544401072 +0000 UTC m=+1169.931921789" Dec 03 17:19:35 crc kubenswrapper[4841]: I1203 17:19:35.806306 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:35 crc kubenswrapper[4841]: I1203 17:19:35.874238 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2vksl"] Dec 03 17:19:35 crc kubenswrapper[4841]: I1203 17:19:35.874678 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" podUID="f982a994-2292-49fb-9b20-4f78c9730210" containerName="dnsmasq-dns" containerID="cri-o://b8ceb4deb350fd244cdb0a09938369b3859bf9fcdbd40f3d9ca04d4c73082ec1" gracePeriod=10 Dec 03 17:19:36 crc kubenswrapper[4841]: I1203 17:19:36.534531 4841 generic.go:334] "Generic (PLEG): container finished" podID="f982a994-2292-49fb-9b20-4f78c9730210" containerID="b8ceb4deb350fd244cdb0a09938369b3859bf9fcdbd40f3d9ca04d4c73082ec1" exitCode=0 Dec 03 17:19:36 crc kubenswrapper[4841]: I1203 17:19:36.534575 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" 
event={"ID":"f982a994-2292-49fb-9b20-4f78c9730210","Type":"ContainerDied","Data":"b8ceb4deb350fd244cdb0a09938369b3859bf9fcdbd40f3d9ca04d4c73082ec1"} Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.564694 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-gj6sq" event={"ID":"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa","Type":"ContainerDied","Data":"94e220eef109e4dd9375041dbff128f2443c05eea616de394c57fd65864317fd"} Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.565281 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94e220eef109e4dd9375041dbff128f2443c05eea616de394c57fd65864317fd" Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.568097 4841 generic.go:334] "Generic (PLEG): container finished" podID="6a7b606d-af6a-477c-9ac6-f93db645651d" containerID="f06c4fa375358c54eef0791779277a440dec87b0219d207917bc57f5f83f2919" exitCode=0 Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.568155 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-drrfd" event={"ID":"6a7b606d-af6a-477c-9ac6-f93db645651d","Type":"ContainerDied","Data":"f06c4fa375358c54eef0791779277a440dec87b0219d207917bc57f5f83f2919"} Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.630491 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.732309 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdcrv\" (UniqueName: \"kubernetes.io/projected/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-kube-api-access-xdcrv\") pod \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.732507 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-db-sync-config-data\") pod \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.732542 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-combined-ca-bundle\") pod \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\" (UID: \"084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa\") " Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.750090 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-kube-api-access-xdcrv" (OuterVolumeSpecName: "kube-api-access-xdcrv") pod "084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" (UID: "084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa"). InnerVolumeSpecName "kube-api-access-xdcrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.750099 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" (UID: "084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.787662 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" (UID: "084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.834389 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.834414 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:38 crc kubenswrapper[4841]: I1203 17:19:38.834423 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdcrv\" (UniqueName: \"kubernetes.io/projected/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa-kube-api-access-xdcrv\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.317126 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.317207 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.332830 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.444827 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-config\") pod \"f982a994-2292-49fb-9b20-4f78c9730210\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.444895 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vlq5m\" (UniqueName: \"kubernetes.io/projected/f982a994-2292-49fb-9b20-4f78c9730210-kube-api-access-vlq5m\") pod \"f982a994-2292-49fb-9b20-4f78c9730210\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.444956 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-sb\") pod \"f982a994-2292-49fb-9b20-4f78c9730210\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.444978 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-nb\") pod \"f982a994-2292-49fb-9b20-4f78c9730210\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.445083 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-svc\") pod 
\"f982a994-2292-49fb-9b20-4f78c9730210\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.445114 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-swift-storage-0\") pod \"f982a994-2292-49fb-9b20-4f78c9730210\" (UID: \"f982a994-2292-49fb-9b20-4f78c9730210\") " Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.452064 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f982a994-2292-49fb-9b20-4f78c9730210-kube-api-access-vlq5m" (OuterVolumeSpecName: "kube-api-access-vlq5m") pod "f982a994-2292-49fb-9b20-4f78c9730210" (UID: "f982a994-2292-49fb-9b20-4f78c9730210"). InnerVolumeSpecName "kube-api-access-vlq5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.496385 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-config" (OuterVolumeSpecName: "config") pod "f982a994-2292-49fb-9b20-4f78c9730210" (UID: "f982a994-2292-49fb-9b20-4f78c9730210"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.499556 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f982a994-2292-49fb-9b20-4f78c9730210" (UID: "f982a994-2292-49fb-9b20-4f78c9730210"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.499630 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f982a994-2292-49fb-9b20-4f78c9730210" (UID: "f982a994-2292-49fb-9b20-4f78c9730210"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.508285 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f982a994-2292-49fb-9b20-4f78c9730210" (UID: "f982a994-2292-49fb-9b20-4f78c9730210"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.516145 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f982a994-2292-49fb-9b20-4f78c9730210" (UID: "f982a994-2292-49fb-9b20-4f78c9730210"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.546955 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.546985 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.546995 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.547006 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vlq5m\" (UniqueName: \"kubernetes.io/projected/f982a994-2292-49fb-9b20-4f78c9730210-kube-api-access-vlq5m\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.547016 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.547025 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f982a994-2292-49fb-9b20-4f78c9730210-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.582247 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-gj6sq" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.585265 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.585876 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" event={"ID":"f982a994-2292-49fb-9b20-4f78c9730210","Type":"ContainerDied","Data":"b78b85cef084a674494aaf17a47a4babd04a8f3296288cb39eb3129a91727fcd"} Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.585960 4841 scope.go:117] "RemoveContainer" containerID="b8ceb4deb350fd244cdb0a09938369b3859bf9fcdbd40f3d9ca04d4c73082ec1" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.643140 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2vksl"] Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.653802 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-2vksl"] Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.847993 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-745f9599f8-67b5b"] Dec 03 17:19:39 crc kubenswrapper[4841]: E1203 17:19:39.848515 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f982a994-2292-49fb-9b20-4f78c9730210" containerName="init" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.848537 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f982a994-2292-49fb-9b20-4f78c9730210" containerName="init" Dec 03 17:19:39 crc kubenswrapper[4841]: E1203 17:19:39.848578 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" containerName="barbican-db-sync" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.848588 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" containerName="barbican-db-sync" Dec 03 17:19:39 crc kubenswrapper[4841]: E1203 17:19:39.848599 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f982a994-2292-49fb-9b20-4f78c9730210" containerName="dnsmasq-dns" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.848609 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f982a994-2292-49fb-9b20-4f78c9730210" containerName="dnsmasq-dns" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.848818 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f982a994-2292-49fb-9b20-4f78c9730210" containerName="dnsmasq-dns" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.848852 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" containerName="barbican-db-sync" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.870262 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-b7bb8bfcf-5cwg2"] Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.870406 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.872487 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.874445 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.874674 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.875489 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.890290 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-skp9h" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.890923 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b7bb8bfcf-5cwg2"] Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.916013 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-745f9599f8-67b5b"] Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955272 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vspvj\" (UniqueName: \"kubernetes.io/projected/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-kube-api-access-vspvj\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955315 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec85a1b0-91a1-4d24-a64b-239c100a7861-combined-ca-bundle\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 
17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955346 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-logs\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955373 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-config-data-custom\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955414 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-config-data\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955438 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec85a1b0-91a1-4d24-a64b-239c100a7861-logs\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955456 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec85a1b0-91a1-4d24-a64b-239c100a7861-config-data-custom\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: 
\"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955497 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn5lv\" (UniqueName: \"kubernetes.io/projected/ec85a1b0-91a1-4d24-a64b-239c100a7861-kube-api-access-nn5lv\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955516 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-combined-ca-bundle\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.955547 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec85a1b0-91a1-4d24-a64b-239c100a7861-config-data\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.991424 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-6kv82"] Dec 03 17:19:39 crc kubenswrapper[4841]: I1203 17:19:39.992962 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.005964 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-6kv82"] Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057082 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-config-data\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057132 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec85a1b0-91a1-4d24-a64b-239c100a7861-logs\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057157 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ec85a1b0-91a1-4d24-a64b-239c100a7861-config-data-custom\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057208 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn5lv\" (UniqueName: \"kubernetes.io/projected/ec85a1b0-91a1-4d24-a64b-239c100a7861-kube-api-access-nn5lv\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057229 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-combined-ca-bundle\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057262 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec85a1b0-91a1-4d24-a64b-239c100a7861-config-data\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057311 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vspvj\" (UniqueName: \"kubernetes.io/projected/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-kube-api-access-vspvj\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057327 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec85a1b0-91a1-4d24-a64b-239c100a7861-combined-ca-bundle\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057354 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-logs\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.057377 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-config-data-custom\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.048085 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7cc745dffb-pksm6"] Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.059334 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ec85a1b0-91a1-4d24-a64b-239c100a7861-logs\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.060417 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cc745dffb-pksm6"] Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.060501 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.061865 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-logs\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.063038 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.072633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-config-data\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.073369 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-config-data-custom\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.073879 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec85a1b0-91a1-4d24-a64b-239c100a7861-combined-ca-bundle\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.074498 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/ec85a1b0-91a1-4d24-a64b-239c100a7861-config-data-custom\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.075227 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ec85a1b0-91a1-4d24-a64b-239c100a7861-config-data\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.079220 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn5lv\" (UniqueName: \"kubernetes.io/projected/ec85a1b0-91a1-4d24-a64b-239c100a7861-kube-api-access-nn5lv\") pod \"barbican-keystone-listener-745f9599f8-67b5b\" (UID: \"ec85a1b0-91a1-4d24-a64b-239c100a7861\") " pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.079674 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-combined-ca-bundle\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.084327 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vspvj\" (UniqueName: \"kubernetes.io/projected/e2bde16a-a813-49ff-ac28-11cf8d1dfac4-kube-api-access-vspvj\") pod \"barbican-worker-b7bb8bfcf-5cwg2\" (UID: \"e2bde16a-a813-49ff-ac28-11cf8d1dfac4\") " pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160005 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e39677-20d4-410e-80d6-8321877f674d-logs\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160109 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwznx\" (UniqueName: \"kubernetes.io/projected/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-kube-api-access-lwznx\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160135 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160162 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160242 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data-custom\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160279 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj6ql\" (UniqueName: \"kubernetes.io/projected/28e39677-20d4-410e-80d6-8321877f674d-kube-api-access-qj6ql\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160317 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160339 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160354 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.160414 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-config\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 
crc kubenswrapper[4841]: I1203 17:19:40.160433 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-combined-ca-bundle\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.224335 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.233997 4841 scope.go:117] "RemoveContainer" containerID="510aecc00db9e1df7bb526f80c2867276455332dad5125a58e7abc1cf9561e6d" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.241617 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.254078 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f982a994-2292-49fb-9b20-4f78c9730210" path="/var/lib/kubelet/pods/f982a994-2292-49fb-9b20-4f78c9730210/volumes" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262378 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj6ql\" (UniqueName: \"kubernetes.io/projected/28e39677-20d4-410e-80d6-8321877f674d-kube-api-access-qj6ql\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262668 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " 
pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262693 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262711 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262768 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-config\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262790 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-combined-ca-bundle\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262812 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e39677-20d4-410e-80d6-8321877f674d-logs\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc 
kubenswrapper[4841]: I1203 17:19:40.262838 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwznx\" (UniqueName: \"kubernetes.io/projected/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-kube-api-access-lwznx\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262857 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262883 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.262925 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data-custom\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.263633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e39677-20d4-410e-80d6-8321877f674d-logs\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.264582 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-svc\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.265644 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.265801 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.266240 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-config\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.267195 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.270032 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-combined-ca-bundle\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.270188 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.271001 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data-custom\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.279657 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj6ql\" (UniqueName: \"kubernetes.io/projected/28e39677-20d4-410e-80d6-8321877f674d-kube-api-access-qj6ql\") pod \"barbican-api-7cc745dffb-pksm6\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.281541 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwznx\" (UniqueName: \"kubernetes.io/projected/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-kube-api-access-lwznx\") pod \"dnsmasq-dns-85ff748b95-6kv82\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.285319 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-drrfd" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.335673 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.363897 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wm9mt\" (UniqueName: \"kubernetes.io/projected/6a7b606d-af6a-477c-9ac6-f93db645651d-kube-api-access-wm9mt\") pod \"6a7b606d-af6a-477c-9ac6-f93db645651d\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.364043 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-combined-ca-bundle\") pod \"6a7b606d-af6a-477c-9ac6-f93db645651d\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.364076 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-config-data\") pod \"6a7b606d-af6a-477c-9ac6-f93db645651d\" (UID: \"6a7b606d-af6a-477c-9ac6-f93db645651d\") " Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.368961 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a7b606d-af6a-477c-9ac6-f93db645651d-kube-api-access-wm9mt" (OuterVolumeSpecName: "kube-api-access-wm9mt") pod "6a7b606d-af6a-477c-9ac6-f93db645651d" (UID: "6a7b606d-af6a-477c-9ac6-f93db645651d"). InnerVolumeSpecName "kube-api-access-wm9mt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.389658 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a7b606d-af6a-477c-9ac6-f93db645651d" (UID: "6a7b606d-af6a-477c-9ac6-f93db645651d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.457513 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.471246 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wm9mt\" (UniqueName: \"kubernetes.io/projected/6a7b606d-af6a-477c-9ac6-f93db645651d-kube-api-access-wm9mt\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.475062 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.482213 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-config-data" (OuterVolumeSpecName: "config-data") pod "6a7b606d-af6a-477c-9ac6-f93db645651d" (UID: "6a7b606d-af6a-477c-9ac6-f93db645651d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.578488 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a7b606d-af6a-477c-9ac6-f93db645651d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.637093 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-drrfd" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.638941 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-drrfd" event={"ID":"6a7b606d-af6a-477c-9ac6-f93db645651d","Type":"ContainerDied","Data":"f6910c71e7e18e9b0a7488a3fd6a2a419128c94e6bf97fdcc21f38ac41b6bd9b"} Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.638983 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6910c71e7e18e9b0a7488a3fd6a2a419128c94e6bf97fdcc21f38ac41b6bd9b" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.643369 4841 generic.go:334] "Generic (PLEG): container finished" podID="c094953c-fc36-4dda-9497-381f9ae48471" containerID="49dbc947db12cfbe84efcd55391a15aa3b2a13e182496b08f555ed86684a8885" exitCode=0 Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.643414 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-82zhd" event={"ID":"c094953c-fc36-4dda-9497-381f9ae48471","Type":"ContainerDied","Data":"49dbc947db12cfbe84efcd55391a15aa3b2a13e182496b08f555ed86684a8885"} Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.686149 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-745f9599f8-67b5b"] Dec 03 17:19:40 crc kubenswrapper[4841]: E1203 17:19:40.758516 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
config: context canceled\"" pod="openstack/ceilometer-0" podUID="bd5216b6-064c-45a5-868c-816d362eced0" Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.982637 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-b7bb8bfcf-5cwg2"] Dec 03 17:19:40 crc kubenswrapper[4841]: I1203 17:19:40.991604 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-6kv82"] Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.130219 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7cc745dffb-pksm6"] Dec 03 17:19:41 crc kubenswrapper[4841]: W1203 17:19:41.134630 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28e39677_20d4_410e_80d6_8321877f674d.slice/crio-6753657589eb205e8273e2833897b3b31f5ba6989801d0014e56e3387001a442 WatchSource:0}: Error finding container 6753657589eb205e8273e2833897b3b31f5ba6989801d0014e56e3387001a442: Status 404 returned error can't find the container with id 6753657589eb205e8273e2833897b3b31f5ba6989801d0014e56e3387001a442 Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.658587 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" event={"ID":"ea6b1c58-53be-4b41-a32f-8eea3589e1a5","Type":"ContainerStarted","Data":"40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4"} Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.658953 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" event={"ID":"ea6b1c58-53be-4b41-a32f-8eea3589e1a5","Type":"ContainerStarted","Data":"bee514c7ed8c2b9ef9d4f29943f32a1ee4aa32aa8b7a0d05295c9c11a37e6bbe"} Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.672533 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" 
event={"ID":"e2bde16a-a813-49ff-ac28-11cf8d1dfac4","Type":"ContainerStarted","Data":"96fba1b27921e9266a1d92792ad5f1b848fcfdc7ba4adc4b8f81d79949c6b993"} Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.674694 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc745dffb-pksm6" event={"ID":"28e39677-20d4-410e-80d6-8321877f674d","Type":"ContainerStarted","Data":"888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc"} Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.674743 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc745dffb-pksm6" event={"ID":"28e39677-20d4-410e-80d6-8321877f674d","Type":"ContainerStarted","Data":"6753657589eb205e8273e2833897b3b31f5ba6989801d0014e56e3387001a442"} Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.677826 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" event={"ID":"ec85a1b0-91a1-4d24-a64b-239c100a7861","Type":"ContainerStarted","Data":"6cc84c2fc7b5af3bc6e3fc2c6f5eb5d248e816e63f82edf16e4ff494f70b83c4"} Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.698798 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="ceilometer-notification-agent" containerID="cri-o://000e6d5dbe0b6a01884099e5baec768c622b87c904b98a1bb2f122c1e18013ca" gracePeriod=30 Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.699007 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd5216b6-064c-45a5-868c-816d362eced0","Type":"ContainerStarted","Data":"f9a89c326f139c89bfa6e95e7c08bb7b36af06f30d7f8aa9f3917f057c7485d0"} Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.699054 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.699135 
4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="sg-core" containerID="cri-o://bdde33202f03aa340bb118003bd07d691d2d2f5eba6e187eaace5f92277ee8a6" gracePeriod=30 Dec 03 17:19:41 crc kubenswrapper[4841]: I1203 17:19:41.699282 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="proxy-httpd" containerID="cri-o://f9a89c326f139c89bfa6e95e7c08bb7b36af06f30d7f8aa9f3917f057c7485d0" gracePeriod=30 Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.091885 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-82zhd" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.207848 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-db-sync-config-data\") pod \"c094953c-fc36-4dda-9497-381f9ae48471\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.207957 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sg7k6\" (UniqueName: \"kubernetes.io/projected/c094953c-fc36-4dda-9497-381f9ae48471-kube-api-access-sg7k6\") pod \"c094953c-fc36-4dda-9497-381f9ae48471\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.207990 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-config-data\") pod \"c094953c-fc36-4dda-9497-381f9ae48471\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.208016 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-combined-ca-bundle\") pod \"c094953c-fc36-4dda-9497-381f9ae48471\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.208248 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c094953c-fc36-4dda-9497-381f9ae48471-etc-machine-id\") pod \"c094953c-fc36-4dda-9497-381f9ae48471\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.208585 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-scripts\") pod \"c094953c-fc36-4dda-9497-381f9ae48471\" (UID: \"c094953c-fc36-4dda-9497-381f9ae48471\") " Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.208691 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c094953c-fc36-4dda-9497-381f9ae48471-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c094953c-fc36-4dda-9497-381f9ae48471" (UID: "c094953c-fc36-4dda-9497-381f9ae48471"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.209184 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c094953c-fc36-4dda-9497-381f9ae48471-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.213053 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c094953c-fc36-4dda-9497-381f9ae48471-kube-api-access-sg7k6" (OuterVolumeSpecName: "kube-api-access-sg7k6") pod "c094953c-fc36-4dda-9497-381f9ae48471" (UID: "c094953c-fc36-4dda-9497-381f9ae48471"). InnerVolumeSpecName "kube-api-access-sg7k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.213074 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c094953c-fc36-4dda-9497-381f9ae48471" (UID: "c094953c-fc36-4dda-9497-381f9ae48471"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.218999 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-scripts" (OuterVolumeSpecName: "scripts") pod "c094953c-fc36-4dda-9497-381f9ae48471" (UID: "c094953c-fc36-4dda-9497-381f9ae48471"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.241788 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c094953c-fc36-4dda-9497-381f9ae48471" (UID: "c094953c-fc36-4dda-9497-381f9ae48471"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.270091 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-config-data" (OuterVolumeSpecName: "config-data") pod "c094953c-fc36-4dda-9497-381f9ae48471" (UID: "c094953c-fc36-4dda-9497-381f9ae48471"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.310770 4841 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.310802 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sg7k6\" (UniqueName: \"kubernetes.io/projected/c094953c-fc36-4dda-9497-381f9ae48471-kube-api-access-sg7k6\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.311017 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.311026 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-combined-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.311034 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c094953c-fc36-4dda-9497-381f9ae48471-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.712819 4841 generic.go:334] "Generic (PLEG): container finished" podID="ea6b1c58-53be-4b41-a32f-8eea3589e1a5" containerID="40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4" exitCode=0 Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.713011 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" event={"ID":"ea6b1c58-53be-4b41-a32f-8eea3589e1a5","Type":"ContainerDied","Data":"40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4"} Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.713192 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" event={"ID":"ea6b1c58-53be-4b41-a32f-8eea3589e1a5","Type":"ContainerStarted","Data":"d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d"} Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.713238 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.720607 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc745dffb-pksm6" event={"ID":"28e39677-20d4-410e-80d6-8321877f674d","Type":"ContainerStarted","Data":"163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf"} Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.722230 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.722312 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7cc745dffb-pksm6" 
Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.730637 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-82zhd" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.730634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-82zhd" event={"ID":"c094953c-fc36-4dda-9497-381f9ae48471","Type":"ContainerDied","Data":"8cb206a9bb64000cd8d0871a84ddb75b2ed59f8e6d869a4013fca8ff3c227306"} Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.731003 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cb206a9bb64000cd8d0871a84ddb75b2ed59f8e6d869a4013fca8ff3c227306" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.744452 4841 generic.go:334] "Generic (PLEG): container finished" podID="bd5216b6-064c-45a5-868c-816d362eced0" containerID="f9a89c326f139c89bfa6e95e7c08bb7b36af06f30d7f8aa9f3917f057c7485d0" exitCode=0 Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.744481 4841 generic.go:334] "Generic (PLEG): container finished" podID="bd5216b6-064c-45a5-868c-816d362eced0" containerID="bdde33202f03aa340bb118003bd07d691d2d2f5eba6e187eaace5f92277ee8a6" exitCode=2 Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.744490 4841 generic.go:334] "Generic (PLEG): container finished" podID="bd5216b6-064c-45a5-868c-816d362eced0" containerID="000e6d5dbe0b6a01884099e5baec768c622b87c904b98a1bb2f122c1e18013ca" exitCode=0 Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.744510 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd5216b6-064c-45a5-868c-816d362eced0","Type":"ContainerDied","Data":"f9a89c326f139c89bfa6e95e7c08bb7b36af06f30d7f8aa9f3917f057c7485d0"} Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.744534 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bd5216b6-064c-45a5-868c-816d362eced0","Type":"ContainerDied","Data":"bdde33202f03aa340bb118003bd07d691d2d2f5eba6e187eaace5f92277ee8a6"} Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.744544 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd5216b6-064c-45a5-868c-816d362eced0","Type":"ContainerDied","Data":"000e6d5dbe0b6a01884099e5baec768c622b87c904b98a1bb2f122c1e18013ca"} Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.748813 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" podStartSLOduration=3.748790565 podStartE2EDuration="3.748790565s" podCreationTimestamp="2025-12-03 17:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:42.743585653 +0000 UTC m=+1177.131106380" watchObservedRunningTime="2025-12-03 17:19:42.748790565 +0000 UTC m=+1177.136311312" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.860090 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7cc745dffb-pksm6" podStartSLOduration=2.860067853 podStartE2EDuration="2.860067853s" podCreationTimestamp="2025-12-03 17:19:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:42.777047464 +0000 UTC m=+1177.164568191" watchObservedRunningTime="2025-12-03 17:19:42.860067853 +0000 UTC m=+1177.247588580" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.862874 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 17:19:42 crc kubenswrapper[4841]: E1203 17:19:42.863227 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a7b606d-af6a-477c-9ac6-f93db645651d" containerName="heat-db-sync" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 
17:19:42.863246 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a7b606d-af6a-477c-9ac6-f93db645651d" containerName="heat-db-sync" Dec 03 17:19:42 crc kubenswrapper[4841]: E1203 17:19:42.863269 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c094953c-fc36-4dda-9497-381f9ae48471" containerName="cinder-db-sync" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.863276 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c094953c-fc36-4dda-9497-381f9ae48471" containerName="cinder-db-sync" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.863450 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a7b606d-af6a-477c-9ac6-f93db645651d" containerName="heat-db-sync" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.863477 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c094953c-fc36-4dda-9497-381f9ae48471" containerName="cinder-db-sync" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.864604 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.869349 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.869565 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.869734 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.870751 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-cwpsc" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.922617 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.931382 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.931427 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fzkv\" (UniqueName: \"kubernetes.io/projected/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-kube-api-access-2fzkv\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.931465 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.931483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.931527 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.931550 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.933463 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-6kv82"] Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.963317 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-57c8cf4866-6qqks"] Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.964794 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.976650 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.976836 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 17:19:42 crc kubenswrapper[4841]: I1203 17:19:42.981251 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57c8cf4866-6qqks"] Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.007985 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6xl2c"] Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.010183 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.022728 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6xl2c"] Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033194 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-scripts\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033239 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033328 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-config-data-custom\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033370 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033394 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033488 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2adaec8-2204-42ce-bc82-2f7e45008cad-logs\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033531 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62x4v\" (UniqueName: \"kubernetes.io/projected/a2adaec8-2204-42ce-bc82-2f7e45008cad-kube-api-access-62x4v\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033563 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-public-tls-certs\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033583 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-internal-tls-certs\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033608 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-config-data\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033645 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033670 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fzkv\" (UniqueName: \"kubernetes.io/projected/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-kube-api-access-2fzkv\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.033701 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-combined-ca-bundle\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.042778 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.050321 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-scripts\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.050400 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.050808 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.050933 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " 
pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.057825 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fzkv\" (UniqueName: \"kubernetes.io/projected/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-kube-api-access-2fzkv\") pod \"cinder-scheduler-0\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.060419 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.061862 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.064194 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.115759 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.135665 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-public-tls-certs\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.135725 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-internal-tls-certs\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.135771 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-config-data\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.135968 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-scripts\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.135999 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wc4v\" (UniqueName: \"kubernetes.io/projected/d82a5adc-8e5d-4394-a96f-2bfad235d269-kube-api-access-5wc4v\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136055 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136335 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d82a5adc-8e5d-4394-a96f-2bfad235d269-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136373 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136427 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136547 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data-custom\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136580 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-combined-ca-bundle\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136622 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136659 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d82a5adc-8e5d-4394-a96f-2bfad235d269-logs\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136690 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-config-data-custom\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136715 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-config\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136797 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj9sp\" (UniqueName: \"kubernetes.io/projected/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-kube-api-access-dj9sp\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.136932 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.137257 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.137484 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2adaec8-2204-42ce-bc82-2f7e45008cad-logs\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.137600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62x4v\" (UniqueName: \"kubernetes.io/projected/a2adaec8-2204-42ce-bc82-2f7e45008cad-kube-api-access-62x4v\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.138053 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2adaec8-2204-42ce-bc82-2f7e45008cad-logs\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.141518 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-public-tls-certs\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.143238 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-config-data-custom\") pod 
\"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.143566 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-config-data\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.151323 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-combined-ca-bundle\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.153028 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a2adaec8-2204-42ce-bc82-2f7e45008cad-internal-tls-certs\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.156612 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62x4v\" (UniqueName: \"kubernetes.io/projected/a2adaec8-2204-42ce-bc82-2f7e45008cad-kube-api-access-62x4v\") pod \"barbican-api-57c8cf4866-6qqks\" (UID: \"a2adaec8-2204-42ce-bc82-2f7e45008cad\") " pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240380 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82a5adc-8e5d-4394-a96f-2bfad235d269-logs\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " 
pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240449 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-config\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240535 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj9sp\" (UniqueName: \"kubernetes.io/projected/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-kube-api-access-dj9sp\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240599 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240658 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-scripts\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240676 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5wc4v\" (UniqueName: \"kubernetes.io/projected/d82a5adc-8e5d-4394-a96f-2bfad235d269-kube-api-access-5wc4v\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240708 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240726 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d82a5adc-8e5d-4394-a96f-2bfad235d269-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240741 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240759 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240795 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data-custom\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240832 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.240922 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82a5adc-8e5d-4394-a96f-2bfad235d269-logs\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.241961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-config\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.242482 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.242519 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d82a5adc-8e5d-4394-a96f-2bfad235d269-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: 
I1203 17:19:43.242989 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.243355 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.246137 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-scripts\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.246390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.246480 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.247119 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-svc\") pod 
\"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.249261 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data-custom\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.258088 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj9sp\" (UniqueName: \"kubernetes.io/projected/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-kube-api-access-dj9sp\") pod \"dnsmasq-dns-5c9776ccc5-6xl2c\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.262172 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wc4v\" (UniqueName: \"kubernetes.io/projected/d82a5adc-8e5d-4394-a96f-2bfad235d269-kube-api-access-5wc4v\") pod \"cinder-api-0\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.275830 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.294724 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.341786 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.418006 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.490050 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.546782 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-scripts\") pod \"bd5216b6-064c-45a5-868c-816d362eced0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.546865 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-config-data\") pod \"bd5216b6-064c-45a5-868c-816d362eced0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.547004 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-log-httpd\") pod \"bd5216b6-064c-45a5-868c-816d362eced0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.547079 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-combined-ca-bundle\") pod \"bd5216b6-064c-45a5-868c-816d362eced0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.547106 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-run-httpd\") pod \"bd5216b6-064c-45a5-868c-816d362eced0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 
17:19:43.547151 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-sg-core-conf-yaml\") pod \"bd5216b6-064c-45a5-868c-816d362eced0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.547216 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52l7f\" (UniqueName: \"kubernetes.io/projected/bd5216b6-064c-45a5-868c-816d362eced0-kube-api-access-52l7f\") pod \"bd5216b6-064c-45a5-868c-816d362eced0\" (UID: \"bd5216b6-064c-45a5-868c-816d362eced0\") " Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.549130 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bd5216b6-064c-45a5-868c-816d362eced0" (UID: "bd5216b6-064c-45a5-868c-816d362eced0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.549270 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bd5216b6-064c-45a5-868c-816d362eced0" (UID: "bd5216b6-064c-45a5-868c-816d362eced0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.553612 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-scripts" (OuterVolumeSpecName: "scripts") pod "bd5216b6-064c-45a5-868c-816d362eced0" (UID: "bd5216b6-064c-45a5-868c-816d362eced0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.558093 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd5216b6-064c-45a5-868c-816d362eced0-kube-api-access-52l7f" (OuterVolumeSpecName: "kube-api-access-52l7f") pod "bd5216b6-064c-45a5-868c-816d362eced0" (UID: "bd5216b6-064c-45a5-868c-816d362eced0"). InnerVolumeSpecName "kube-api-access-52l7f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.584279 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bd5216b6-064c-45a5-868c-816d362eced0" (UID: "bd5216b6-064c-45a5-868c-816d362eced0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.632484 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd5216b6-064c-45a5-868c-816d362eced0" (UID: "bd5216b6-064c-45a5-868c-816d362eced0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.650084 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.650111 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.650120 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.650129 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52l7f\" (UniqueName: \"kubernetes.io/projected/bd5216b6-064c-45a5-868c-816d362eced0-kube-api-access-52l7f\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.650138 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.650146 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bd5216b6-064c-45a5-868c-816d362eced0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.667099 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-config-data" (OuterVolumeSpecName: "config-data") pod "bd5216b6-064c-45a5-868c-816d362eced0" (UID: "bd5216b6-064c-45a5-868c-816d362eced0"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.755119 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd5216b6-064c-45a5-868c-816d362eced0-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.780340 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bd5216b6-064c-45a5-868c-816d362eced0","Type":"ContainerDied","Data":"a27ec0622467c98ef63ccb403bfad0c174efda540736a571b2dcba6ab4e53bd1"} Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.780488 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.780527 4841 scope.go:117] "RemoveContainer" containerID="f9a89c326f139c89bfa6e95e7c08bb7b36af06f30d7f8aa9f3917f057c7485d0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.793749 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-2vksl" podUID="f982a994-2292-49fb-9b20-4f78c9730210" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.143:5353: i/o timeout" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.880018 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.889271 4841 scope.go:117] "RemoveContainer" containerID="bdde33202f03aa340bb118003bd07d691d2d2f5eba6e187eaace5f92277ee8a6" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.890417 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.903944 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:19:43 crc kubenswrapper[4841]: E1203 17:19:43.904351 4841 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="sg-core" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.904367 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="sg-core" Dec 03 17:19:43 crc kubenswrapper[4841]: E1203 17:19:43.904383 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="proxy-httpd" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.904392 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="proxy-httpd" Dec 03 17:19:43 crc kubenswrapper[4841]: E1203 17:19:43.904409 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="ceilometer-notification-agent" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.904416 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="ceilometer-notification-agent" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.904585 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="ceilometer-notification-agent" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.904599 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="proxy-httpd" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.904616 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd5216b6-064c-45a5-868c-816d362eced0" containerName="sg-core" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.906072 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.906194 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.910347 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.910363 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.914073 4841 scope.go:117] "RemoveContainer" containerID="000e6d5dbe0b6a01884099e5baec768c622b87c904b98a1bb2f122c1e18013ca" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.960313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-run-httpd\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.960840 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phr8z\" (UniqueName: \"kubernetes.io/projected/f4fee296-7753-4460-82f5-45df436f475d-kube-api-access-phr8z\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.960875 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-scripts\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.960921 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.961196 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-config-data\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.961248 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-log-httpd\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:43 crc kubenswrapper[4841]: I1203 17:19:43.961308 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: W1203 17:19:44.036709 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36bb6744_f5a9_43ec_98a8_762b5d2cb88d.slice/crio-ad9f7f04fb9a63bb89cd07fcaf44d7f61e751c9375be5efcb70e68903849a581 WatchSource:0}: Error finding container ad9f7f04fb9a63bb89cd07fcaf44d7f61e751c9375be5efcb70e68903849a581: Status 404 returned error can't find the container with id ad9f7f04fb9a63bb89cd07fcaf44d7f61e751c9375be5efcb70e68903849a581 Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.040097 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.064671 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-run-httpd\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.064793 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phr8z\" (UniqueName: \"kubernetes.io/projected/f4fee296-7753-4460-82f5-45df436f475d-kube-api-access-phr8z\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.065052 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-scripts\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.065081 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.065120 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-config-data\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.065135 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-log-httpd\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " 
pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.065160 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.067124 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-run-httpd\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.067775 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-log-httpd\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.075729 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.075858 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.083409 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-scripts\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.083668 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-config-data\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.086628 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phr8z\" (UniqueName: \"kubernetes.io/projected/f4fee296-7753-4460-82f5-45df436f475d-kube-api-access-phr8z\") pod \"ceilometer-0\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.119704 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6xl2c"] Dec 03 17:19:44 crc kubenswrapper[4841]: W1203 17:19:44.133123 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd07669f7_1f8e_4e31_96a6_f3f3fcea5315.slice/crio-a28a83f5c5f1491ff01d82e508a3d0b7b9af421e4c6f89b669eb6d6ef8e66bba WatchSource:0}: Error finding container a28a83f5c5f1491ff01d82e508a3d0b7b9af421e4c6f89b669eb6d6ef8e66bba: Status 404 returned error can't find the container with id a28a83f5c5f1491ff01d82e508a3d0b7b9af421e4c6f89b669eb6d6ef8e66bba Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.156980 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-57c8cf4866-6qqks"] Dec 03 17:19:44 crc kubenswrapper[4841]: W1203 17:19:44.157509 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2adaec8_2204_42ce_bc82_2f7e45008cad.slice/crio-183b016330ab318caef14f1dbb76b21659eeaaa7a39f1f23ba53f31373cec67e WatchSource:0}: Error finding container 183b016330ab318caef14f1dbb76b21659eeaaa7a39f1f23ba53f31373cec67e: Status 404 returned error can't find the container with id 183b016330ab318caef14f1dbb76b21659eeaaa7a39f1f23ba53f31373cec67e Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.233348 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.282674 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd5216b6-064c-45a5-868c-816d362eced0" path="/var/lib/kubelet/pods/bd5216b6-064c-45a5-868c-816d362eced0/volumes" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.284026 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.712326 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.804980 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" event={"ID":"e2bde16a-a813-49ff-ac28-11cf8d1dfac4","Type":"ContainerStarted","Data":"88917dba9ef2e2f2355c49db7de96f751c2ba27f2e7d566d7b0d8ac398f0adf2"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.805021 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" event={"ID":"e2bde16a-a813-49ff-ac28-11cf8d1dfac4","Type":"ContainerStarted","Data":"1a9b80b4a5af2ef751a107aaf8bd153eeb9709e7563382ff136d99fd2159341c"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.817418 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerStarted","Data":"ac12bd895dffa53448fdef84a9d94b6cb8adb9110a74c6f790502c0418eabe53"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.820034 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d82a5adc-8e5d-4394-a96f-2bfad235d269","Type":"ContainerStarted","Data":"4bb681c57fe9de8c2d6969192bbfa38b038cf9fc8d6716a7304c659acb1961b4"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.824141 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-b7bb8bfcf-5cwg2" podStartSLOduration=3.258528487 podStartE2EDuration="5.824126444s" podCreationTimestamp="2025-12-03 17:19:39 +0000 UTC" firstStartedPulling="2025-12-03 17:19:40.968118053 +0000 UTC m=+1175.355638780" lastFinishedPulling="2025-12-03 17:19:43.533716 +0000 UTC m=+1177.921236737" observedRunningTime="2025-12-03 17:19:44.82093554 +0000 UTC m=+1179.208456267" watchObservedRunningTime="2025-12-03 17:19:44.824126444 +0000 UTC m=+1179.211647171" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.832684 4841 generic.go:334] "Generic (PLEG): container finished" podID="d07669f7-1f8e-4e31-96a6-f3f3fcea5315" containerID="b745da054c75ee382772c823e2a56ac2b9387306d6ab5ebbe652542183d563ef" exitCode=0 Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.832778 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" event={"ID":"d07669f7-1f8e-4e31-96a6-f3f3fcea5315","Type":"ContainerDied","Data":"b745da054c75ee382772c823e2a56ac2b9387306d6ab5ebbe652542183d563ef"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.832864 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" event={"ID":"d07669f7-1f8e-4e31-96a6-f3f3fcea5315","Type":"ContainerStarted","Data":"a28a83f5c5f1491ff01d82e508a3d0b7b9af421e4c6f89b669eb6d6ef8e66bba"} Dec 03 17:19:44 crc kubenswrapper[4841]: 
I1203 17:19:44.849832 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"36bb6744-f5a9-43ec-98a8-762b5d2cb88d","Type":"ContainerStarted","Data":"ad9f7f04fb9a63bb89cd07fcaf44d7f61e751c9375be5efcb70e68903849a581"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.863618 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" event={"ID":"ec85a1b0-91a1-4d24-a64b-239c100a7861","Type":"ContainerStarted","Data":"5e00360ea9863f2548c3836f7c4be80968a07a7b6afcf8585dcfca6d23fb0218"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.863923 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" event={"ID":"ec85a1b0-91a1-4d24-a64b-239c100a7861","Type":"ContainerStarted","Data":"5734dc50b1aba76d11baa76db7bd34f73654bdd58c481f4759291b6f27ae2569"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.869383 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57c8cf4866-6qqks" event={"ID":"a2adaec8-2204-42ce-bc82-2f7e45008cad","Type":"ContainerStarted","Data":"1cc3ef5ce3e8c789a806a911da40cc4c9d101d48d8739132c18f4d1fb0140451"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.869406 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57c8cf4866-6qqks" event={"ID":"a2adaec8-2204-42ce-bc82-2f7e45008cad","Type":"ContainerStarted","Data":"f50ed673b0e0389ba932d93245f8b4379dedd7482c9a1c94f5dfd0a9aa0e696f"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.869414 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-57c8cf4866-6qqks" event={"ID":"a2adaec8-2204-42ce-bc82-2f7e45008cad","Type":"ContainerStarted","Data":"183b016330ab318caef14f1dbb76b21659eeaaa7a39f1f23ba53f31373cec67e"} Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.869692 4841 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" podUID="ea6b1c58-53be-4b41-a32f-8eea3589e1a5" containerName="dnsmasq-dns" containerID="cri-o://d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d" gracePeriod=10 Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.898280 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-745f9599f8-67b5b" podStartSLOduration=3.072451734 podStartE2EDuration="5.898255935s" podCreationTimestamp="2025-12-03 17:19:39 +0000 UTC" firstStartedPulling="2025-12-03 17:19:40.678540283 +0000 UTC m=+1175.066061010" lastFinishedPulling="2025-12-03 17:19:43.504344484 +0000 UTC m=+1177.891865211" observedRunningTime="2025-12-03 17:19:44.883806788 +0000 UTC m=+1179.271327525" watchObservedRunningTime="2025-12-03 17:19:44.898255935 +0000 UTC m=+1179.285776662" Dec 03 17:19:44 crc kubenswrapper[4841]: I1203 17:19:44.948841 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-57c8cf4866-6qqks" podStartSLOduration=2.948822616 podStartE2EDuration="2.948822616s" podCreationTimestamp="2025-12-03 17:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:44.920086795 +0000 UTC m=+1179.307607522" watchObservedRunningTime="2025-12-03 17:19:44.948822616 +0000 UTC m=+1179.336343343" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.443340 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.507829 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-svc\") pod \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.507881 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-nb\") pod \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.507955 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-swift-storage-0\") pod \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.507988 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-sb\") pod \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.508121 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-config\") pod \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.508209 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwznx\" 
(UniqueName: \"kubernetes.io/projected/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-kube-api-access-lwznx\") pod \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\" (UID: \"ea6b1c58-53be-4b41-a32f-8eea3589e1a5\") " Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.540248 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-kube-api-access-lwznx" (OuterVolumeSpecName: "kube-api-access-lwznx") pod "ea6b1c58-53be-4b41-a32f-8eea3589e1a5" (UID: "ea6b1c58-53be-4b41-a32f-8eea3589e1a5"). InnerVolumeSpecName "kube-api-access-lwznx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.610034 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwznx\" (UniqueName: \"kubernetes.io/projected/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-kube-api-access-lwznx\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.629790 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ea6b1c58-53be-4b41-a32f-8eea3589e1a5" (UID: "ea6b1c58-53be-4b41-a32f-8eea3589e1a5"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.642347 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-config" (OuterVolumeSpecName: "config") pod "ea6b1c58-53be-4b41-a32f-8eea3589e1a5" (UID: "ea6b1c58-53be-4b41-a32f-8eea3589e1a5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.644242 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ea6b1c58-53be-4b41-a32f-8eea3589e1a5" (UID: "ea6b1c58-53be-4b41-a32f-8eea3589e1a5"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.659465 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea6b1c58-53be-4b41-a32f-8eea3589e1a5" (UID: "ea6b1c58-53be-4b41-a32f-8eea3589e1a5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.664307 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ea6b1c58-53be-4b41-a32f-8eea3589e1a5" (UID: "ea6b1c58-53be-4b41-a32f-8eea3589e1a5"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.711730 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.711761 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.711771 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.711783 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.711791 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea6b1c58-53be-4b41-a32f-8eea3589e1a5-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.888094 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerStarted","Data":"a411dfe32aa66ef5ebc2bc6688c83ea76a3226d765da217a8342bc9ca163bc86"} Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.893409 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" event={"ID":"d07669f7-1f8e-4e31-96a6-f3f3fcea5315","Type":"ContainerStarted","Data":"943505f1a01545930727cd0e63319a51aa58e4fc1a2b5576fecfbc66e542f176"} Dec 03 
17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.893588 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.896173 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"36bb6744-f5a9-43ec-98a8-762b5d2cb88d","Type":"ContainerStarted","Data":"51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814"} Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.899005 4841 generic.go:334] "Generic (PLEG): container finished" podID="ea6b1c58-53be-4b41-a32f-8eea3589e1a5" containerID="d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d" exitCode=0 Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.899057 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" event={"ID":"ea6b1c58-53be-4b41-a32f-8eea3589e1a5","Type":"ContainerDied","Data":"d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d"} Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.899074 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" event={"ID":"ea6b1c58-53be-4b41-a32f-8eea3589e1a5","Type":"ContainerDied","Data":"bee514c7ed8c2b9ef9d4f29943f32a1ee4aa32aa8b7a0d05295c9c11a37e6bbe"} Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.899105 4841 scope.go:117] "RemoveContainer" containerID="d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.899541 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-6kv82" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.902737 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d82a5adc-8e5d-4394-a96f-2bfad235d269","Type":"ContainerStarted","Data":"0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6"} Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.903576 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.903608 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.919270 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" podStartSLOduration=3.9192508310000003 podStartE2EDuration="3.919250831s" podCreationTimestamp="2025-12-03 17:19:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:45.914023129 +0000 UTC m=+1180.301543866" watchObservedRunningTime="2025-12-03 17:19:45.919250831 +0000 UTC m=+1180.306771558" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.938406 4841 scope.go:117] "RemoveContainer" containerID="40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.972488 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-6kv82"] Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.979228 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-6kv82"] Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.991948 4841 scope.go:117] "RemoveContainer" containerID="d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d" Dec 03 
17:19:45 crc kubenswrapper[4841]: E1203 17:19:45.992353 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d\": container with ID starting with d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d not found: ID does not exist" containerID="d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.992392 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d"} err="failed to get container status \"d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d\": rpc error: code = NotFound desc = could not find container \"d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d\": container with ID starting with d2c321feb2c7fa4a8c759ebca3b472aa69c4497459f112a9a97451ada5d1494d not found: ID does not exist" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.992419 4841 scope.go:117] "RemoveContainer" containerID="40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4" Dec 03 17:19:45 crc kubenswrapper[4841]: E1203 17:19:45.992716 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4\": container with ID starting with 40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4 not found: ID does not exist" containerID="40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4" Dec 03 17:19:45 crc kubenswrapper[4841]: I1203 17:19:45.992754 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4"} err="failed to get container status 
\"40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4\": rpc error: code = NotFound desc = could not find container \"40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4\": container with ID starting with 40527c7c67e738ef96d7ad3dbe5f1f4a18e7c481eecdf6f3cb0cdfd9161d86b4 not found: ID does not exist" Dec 03 17:19:46 crc kubenswrapper[4841]: I1203 17:19:46.177556 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 17:19:46 crc kubenswrapper[4841]: I1203 17:19:46.270827 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea6b1c58-53be-4b41-a32f-8eea3589e1a5" path="/var/lib/kubelet/pods/ea6b1c58-53be-4b41-a32f-8eea3589e1a5/volumes" Dec 03 17:19:46 crc kubenswrapper[4841]: I1203 17:19:46.912045 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerStarted","Data":"eb4a8452ed0f0f33acb38d03f13f2051ccb657dabc84d31ca6a34e54df5ff2cc"} Dec 03 17:19:46 crc kubenswrapper[4841]: I1203 17:19:46.914322 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d82a5adc-8e5d-4394-a96f-2bfad235d269","Type":"ContainerStarted","Data":"0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8"} Dec 03 17:19:46 crc kubenswrapper[4841]: I1203 17:19:46.914554 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 17:19:46 crc kubenswrapper[4841]: I1203 17:19:46.917352 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"36bb6744-f5a9-43ec-98a8-762b5d2cb88d","Type":"ContainerStarted","Data":"799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993"} Dec 03 17:19:46 crc kubenswrapper[4841]: I1203 17:19:46.948058 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.9480375 
podStartE2EDuration="3.9480375s" podCreationTimestamp="2025-12-03 17:19:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:46.93648222 +0000 UTC m=+1181.324002957" watchObservedRunningTime="2025-12-03 17:19:46.9480375 +0000 UTC m=+1181.335558247" Dec 03 17:19:46 crc kubenswrapper[4841]: I1203 17:19:46.972350 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.263056869 podStartE2EDuration="4.972332187s" podCreationTimestamp="2025-12-03 17:19:42 +0000 UTC" firstStartedPulling="2025-12-03 17:19:44.04089059 +0000 UTC m=+1178.428411317" lastFinishedPulling="2025-12-03 17:19:44.750165908 +0000 UTC m=+1179.137686635" observedRunningTime="2025-12-03 17:19:46.971212931 +0000 UTC m=+1181.358733648" watchObservedRunningTime="2025-12-03 17:19:46.972332187 +0000 UTC m=+1181.359852934" Dec 03 17:19:47 crc kubenswrapper[4841]: I1203 17:19:47.938433 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerStarted","Data":"8572b36aa5f483884060c16bc9e294bb2caa6c20893fea4c9e2e9252200236cb"} Dec 03 17:19:47 crc kubenswrapper[4841]: I1203 17:19:47.938627 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d82a5adc-8e5d-4394-a96f-2bfad235d269" containerName="cinder-api-log" containerID="cri-o://0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6" gracePeriod=30 Dec 03 17:19:47 crc kubenswrapper[4841]: I1203 17:19:47.939208 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="d82a5adc-8e5d-4394-a96f-2bfad235d269" containerName="cinder-api" containerID="cri-o://0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8" gracePeriod=30 Dec 03 17:19:48 crc kubenswrapper[4841]: 
I1203 17:19:48.276321 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.460621 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.572678 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data\") pod \"d82a5adc-8e5d-4394-a96f-2bfad235d269\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.572751 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-scripts\") pod \"d82a5adc-8e5d-4394-a96f-2bfad235d269\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.572812 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-combined-ca-bundle\") pod \"d82a5adc-8e5d-4394-a96f-2bfad235d269\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.572881 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data-custom\") pod \"d82a5adc-8e5d-4394-a96f-2bfad235d269\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.572933 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d82a5adc-8e5d-4394-a96f-2bfad235d269-etc-machine-id\") pod 
\"d82a5adc-8e5d-4394-a96f-2bfad235d269\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.572990 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wc4v\" (UniqueName: \"kubernetes.io/projected/d82a5adc-8e5d-4394-a96f-2bfad235d269-kube-api-access-5wc4v\") pod \"d82a5adc-8e5d-4394-a96f-2bfad235d269\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.573057 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d82a5adc-8e5d-4394-a96f-2bfad235d269-logs\") pod \"d82a5adc-8e5d-4394-a96f-2bfad235d269\" (UID: \"d82a5adc-8e5d-4394-a96f-2bfad235d269\") " Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.573795 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d82a5adc-8e5d-4394-a96f-2bfad235d269-logs" (OuterVolumeSpecName: "logs") pod "d82a5adc-8e5d-4394-a96f-2bfad235d269" (UID: "d82a5adc-8e5d-4394-a96f-2bfad235d269"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.574208 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d82a5adc-8e5d-4394-a96f-2bfad235d269-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d82a5adc-8e5d-4394-a96f-2bfad235d269" (UID: "d82a5adc-8e5d-4394-a96f-2bfad235d269"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.577434 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d82a5adc-8e5d-4394-a96f-2bfad235d269-kube-api-access-5wc4v" (OuterVolumeSpecName: "kube-api-access-5wc4v") pod "d82a5adc-8e5d-4394-a96f-2bfad235d269" (UID: "d82a5adc-8e5d-4394-a96f-2bfad235d269"). InnerVolumeSpecName "kube-api-access-5wc4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.577635 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-scripts" (OuterVolumeSpecName: "scripts") pod "d82a5adc-8e5d-4394-a96f-2bfad235d269" (UID: "d82a5adc-8e5d-4394-a96f-2bfad235d269"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.577806 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d82a5adc-8e5d-4394-a96f-2bfad235d269" (UID: "d82a5adc-8e5d-4394-a96f-2bfad235d269"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.596478 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d82a5adc-8e5d-4394-a96f-2bfad235d269" (UID: "d82a5adc-8e5d-4394-a96f-2bfad235d269"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.637530 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data" (OuterVolumeSpecName: "config-data") pod "d82a5adc-8e5d-4394-a96f-2bfad235d269" (UID: "d82a5adc-8e5d-4394-a96f-2bfad235d269"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.675773 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.675831 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.675858 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.675879 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d82a5adc-8e5d-4394-a96f-2bfad235d269-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.675897 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wc4v\" (UniqueName: \"kubernetes.io/projected/d82a5adc-8e5d-4394-a96f-2bfad235d269-kube-api-access-5wc4v\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.675946 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/d82a5adc-8e5d-4394-a96f-2bfad235d269-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.675964 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d82a5adc-8e5d-4394-a96f-2bfad235d269-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.954814 4841 generic.go:334] "Generic (PLEG): container finished" podID="d82a5adc-8e5d-4394-a96f-2bfad235d269" containerID="0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8" exitCode=0 Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.954863 4841 generic.go:334] "Generic (PLEG): container finished" podID="d82a5adc-8e5d-4394-a96f-2bfad235d269" containerID="0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6" exitCode=143 Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.954955 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d82a5adc-8e5d-4394-a96f-2bfad235d269","Type":"ContainerDied","Data":"0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8"} Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.954993 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d82a5adc-8e5d-4394-a96f-2bfad235d269","Type":"ContainerDied","Data":"0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6"} Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.955008 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d82a5adc-8e5d-4394-a96f-2bfad235d269","Type":"ContainerDied","Data":"4bb681c57fe9de8c2d6969192bbfa38b038cf9fc8d6716a7304c659acb1961b4"} Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.955037 4841 scope.go:117] "RemoveContainer" containerID="0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 
17:19:48.955259 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.964991 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerStarted","Data":"486365a09e373d1ef8a8361fd6355529f52a5a366911d17b2c9deed60994958b"} Dec 03 17:19:48 crc kubenswrapper[4841]: I1203 17:19:48.966145 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.016662 4841 scope.go:117] "RemoveContainer" containerID="0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.046149 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.574838021 podStartE2EDuration="6.046128081s" podCreationTimestamp="2025-12-03 17:19:43 +0000 UTC" firstStartedPulling="2025-12-03 17:19:44.752519343 +0000 UTC m=+1179.140040090" lastFinishedPulling="2025-12-03 17:19:48.223809423 +0000 UTC m=+1182.611330150" observedRunningTime="2025-12-03 17:19:49.012371733 +0000 UTC m=+1183.399892510" watchObservedRunningTime="2025-12-03 17:19:49.046128081 +0000 UTC m=+1183.433648818" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.052819 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.056586 4841 scope.go:117] "RemoveContainer" containerID="0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8" Dec 03 17:19:49 crc kubenswrapper[4841]: E1203 17:19:49.057318 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8\": container with ID starting with 
0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8 not found: ID does not exist" containerID="0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.057382 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8"} err="failed to get container status \"0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8\": rpc error: code = NotFound desc = could not find container \"0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8\": container with ID starting with 0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8 not found: ID does not exist" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.057425 4841 scope.go:117] "RemoveContainer" containerID="0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6" Dec 03 17:19:49 crc kubenswrapper[4841]: E1203 17:19:49.057877 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6\": container with ID starting with 0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6 not found: ID does not exist" containerID="0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.057953 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6"} err="failed to get container status \"0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6\": rpc error: code = NotFound desc = could not find container \"0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6\": container with ID starting with 0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6 not found: ID does not 
exist" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.057991 4841 scope.go:117] "RemoveContainer" containerID="0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.058467 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8"} err="failed to get container status \"0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8\": rpc error: code = NotFound desc = could not find container \"0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8\": container with ID starting with 0d7a5bb7e25c477ea036a5512ceb6f3ba1da70b3c26d3b0002895ef3247471f8 not found: ID does not exist" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.058500 4841 scope.go:117] "RemoveContainer" containerID="0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.059232 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6"} err="failed to get container status \"0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6\": rpc error: code = NotFound desc = could not find container \"0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6\": container with ID starting with 0aa9ae32502d190314ae5c10f95c4008fd0994fdd21a92df5ff06bf884639aa6 not found: ID does not exist" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.068176 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.079143 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 17:19:49 crc kubenswrapper[4841]: E1203 17:19:49.079498 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ea6b1c58-53be-4b41-a32f-8eea3589e1a5" containerName="init" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.079515 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6b1c58-53be-4b41-a32f-8eea3589e1a5" containerName="init" Dec 03 17:19:49 crc kubenswrapper[4841]: E1203 17:19:49.079531 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea6b1c58-53be-4b41-a32f-8eea3589e1a5" containerName="dnsmasq-dns" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.079538 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea6b1c58-53be-4b41-a32f-8eea3589e1a5" containerName="dnsmasq-dns" Dec 03 17:19:49 crc kubenswrapper[4841]: E1203 17:19:49.079550 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82a5adc-8e5d-4394-a96f-2bfad235d269" containerName="cinder-api-log" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.079556 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82a5adc-8e5d-4394-a96f-2bfad235d269" containerName="cinder-api-log" Dec 03 17:19:49 crc kubenswrapper[4841]: E1203 17:19:49.079571 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d82a5adc-8e5d-4394-a96f-2bfad235d269" containerName="cinder-api" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.079576 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d82a5adc-8e5d-4394-a96f-2bfad235d269" containerName="cinder-api" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.079726 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82a5adc-8e5d-4394-a96f-2bfad235d269" containerName="cinder-api" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.079745 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea6b1c58-53be-4b41-a32f-8eea3589e1a5" containerName="dnsmasq-dns" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.079756 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d82a5adc-8e5d-4394-a96f-2bfad235d269" 
containerName="cinder-api-log" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.082366 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.089539 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.091106 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.091518 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.094639 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.197382 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.197418 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.197475 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 
crc kubenswrapper[4841]: I1203 17:19:49.197688 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-logs\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.197837 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-config-data-custom\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.197965 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.198177 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-987gz\" (UniqueName: \"kubernetes.io/projected/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-kube-api-access-987gz\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.198302 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-config-data\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.198384 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-scripts\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.301463 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.301579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-logs\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.301623 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-config-data-custom\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.301660 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.301693 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-987gz\" (UniqueName: \"kubernetes.io/projected/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-kube-api-access-987gz\") pod \"cinder-api-0\" (UID: 
\"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.301738 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-config-data\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.301766 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-scripts\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.301810 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.301834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.302134 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-logs\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.302392 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.308506 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-scripts\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.308712 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-config-data-custom\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.309633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-config-data\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.309978 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.322772 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.325267 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-987gz\" (UniqueName: \"kubernetes.io/projected/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-kube-api-access-987gz\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.337626 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c12abe1e-e6a0-4bed-9bab-feb7bf43622d-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c12abe1e-e6a0-4bed-9bab-feb7bf43622d\") " pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.441567 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.950079 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 17:19:49 crc kubenswrapper[4841]: W1203 17:19:49.956854 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc12abe1e_e6a0_4bed_9bab_feb7bf43622d.slice/crio-b21dfc9124a51e52153d4c8d2c58f2f5347f03aa7beeb896129dc12d4a985393 WatchSource:0}: Error finding container b21dfc9124a51e52153d4c8d2c58f2f5347f03aa7beeb896129dc12d4a985393: Status 404 returned error can't find the container with id b21dfc9124a51e52153d4c8d2c58f2f5347f03aa7beeb896129dc12d4a985393 Dec 03 17:19:49 crc kubenswrapper[4841]: I1203 17:19:49.986072 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c12abe1e-e6a0-4bed-9bab-feb7bf43622d","Type":"ContainerStarted","Data":"b21dfc9124a51e52153d4c8d2c58f2f5347f03aa7beeb896129dc12d4a985393"} Dec 03 17:19:50 crc kubenswrapper[4841]: I1203 17:19:50.255555 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d82a5adc-8e5d-4394-a96f-2bfad235d269" 
path="/var/lib/kubelet/pods/d82a5adc-8e5d-4394-a96f-2bfad235d269/volumes" Dec 03 17:19:50 crc kubenswrapper[4841]: I1203 17:19:50.822725 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:51 crc kubenswrapper[4841]: I1203 17:19:51.009081 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c12abe1e-e6a0-4bed-9bab-feb7bf43622d","Type":"ContainerStarted","Data":"8f805dbc545ae176a0ea07c08d05763a84de876bada4ce73b7e57e832c85d24c"} Dec 03 17:19:51 crc kubenswrapper[4841]: I1203 17:19:51.907597 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:51 crc kubenswrapper[4841]: I1203 17:19:51.926492 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:52 crc kubenswrapper[4841]: I1203 17:19:52.021348 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c12abe1e-e6a0-4bed-9bab-feb7bf43622d","Type":"ContainerStarted","Data":"432f44be424c2c372410c7704cc0078e124169ec012e31c1057fb5bd8896c32f"} Dec 03 17:19:52 crc kubenswrapper[4841]: I1203 17:19:52.022310 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 17:19:52 crc kubenswrapper[4841]: I1203 17:19:52.041159 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.041139522 podStartE2EDuration="3.041139522s" podCreationTimestamp="2025-12-03 17:19:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:19:52.037283272 +0000 UTC m=+1186.424803999" watchObservedRunningTime="2025-12-03 17:19:52.041139522 +0000 UTC m=+1186.428660249" Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.344075 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.403374 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6fbf7d7cfc-n8r2b" Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.439316 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-5jctc"] Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.439597 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" podUID="f9bcc5c9-d525-4d1c-b74c-c6301b17967f" containerName="dnsmasq-dns" containerID="cri-o://646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6" gracePeriod=10 Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.494362 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-774586549b-jbmpq"] Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.494583 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-774586549b-jbmpq" podUID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerName="neutron-api" containerID="cri-o://4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e" gracePeriod=30 Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.495032 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-774586549b-jbmpq" podUID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerName="neutron-httpd" containerID="cri-o://6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d" gracePeriod=30 Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.592317 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.638603 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/cinder-scheduler-0"] Dec 03 17:19:53 crc kubenswrapper[4841]: I1203 17:19:53.999850 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.037540 4841 generic.go:334] "Generic (PLEG): container finished" podID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerID="6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d" exitCode=0 Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.037595 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774586549b-jbmpq" event={"ID":"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9","Type":"ContainerDied","Data":"6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d"} Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.039293 4841 generic.go:334] "Generic (PLEG): container finished" podID="f9bcc5c9-d525-4d1c-b74c-c6301b17967f" containerID="646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6" exitCode=0 Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.040136 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.040571 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" event={"ID":"f9bcc5c9-d525-4d1c-b74c-c6301b17967f","Type":"ContainerDied","Data":"646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6"} Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.040594 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-5jctc" event={"ID":"f9bcc5c9-d525-4d1c-b74c-c6301b17967f","Type":"ContainerDied","Data":"8812f7acf0803152e32800b26c4b69ec00b72790d7bf44abe6a191492096e180"} Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.040610 4841 scope.go:117] "RemoveContainer" containerID="646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.040769 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerName="cinder-scheduler" containerID="cri-o://51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814" gracePeriod=30 Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.040967 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerName="probe" containerID="cri-o://799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993" gracePeriod=30 Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.075574 4841 scope.go:117] "RemoveContainer" containerID="6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.094606 4841 scope.go:117] "RemoveContainer" containerID="646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6" Dec 03 17:19:54 crc kubenswrapper[4841]: E1203 17:19:54.095060 4841 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6\": container with ID starting with 646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6 not found: ID does not exist" containerID="646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.095112 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6"} err="failed to get container status \"646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6\": rpc error: code = NotFound desc = could not find container \"646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6\": container with ID starting with 646c5cd0e59a4dfb45fd254e7ea9c187afbfeb6ede995137eeae6885fa8cdbf6 not found: ID does not exist" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.095144 4841 scope.go:117] "RemoveContainer" containerID="6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74" Dec 03 17:19:54 crc kubenswrapper[4841]: E1203 17:19:54.096239 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74\": container with ID starting with 6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74 not found: ID does not exist" containerID="6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.096274 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74"} err="failed to get container status \"6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74\": rpc error: code = NotFound 
desc = could not find container \"6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74\": container with ID starting with 6cd2b657e500ac348a65c4633397e8adc08a48a51619393a63a52a6544bd8d74 not found: ID does not exist" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.099146 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-config\") pod \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.099205 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-swift-storage-0\") pod \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.099341 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-nb\") pod \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.099373 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4jh2\" (UniqueName: \"kubernetes.io/projected/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-kube-api-access-s4jh2\") pod \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.099440 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-svc\") pod \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " Dec 03 
17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.099474 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-sb\") pod \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.108183 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-kube-api-access-s4jh2" (OuterVolumeSpecName: "kube-api-access-s4jh2") pod "f9bcc5c9-d525-4d1c-b74c-c6301b17967f" (UID: "f9bcc5c9-d525-4d1c-b74c-c6301b17967f"). InnerVolumeSpecName "kube-api-access-s4jh2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.161866 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-config" (OuterVolumeSpecName: "config") pod "f9bcc5c9-d525-4d1c-b74c-c6301b17967f" (UID: "f9bcc5c9-d525-4d1c-b74c-c6301b17967f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.185649 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f9bcc5c9-d525-4d1c-b74c-c6301b17967f" (UID: "f9bcc5c9-d525-4d1c-b74c-c6301b17967f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.189030 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f9bcc5c9-d525-4d1c-b74c-c6301b17967f" (UID: "f9bcc5c9-d525-4d1c-b74c-c6301b17967f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.205245 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9bcc5c9-d525-4d1c-b74c-c6301b17967f" (UID: "f9bcc5c9-d525-4d1c-b74c-c6301b17967f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:54 crc kubenswrapper[4841]: W1203 17:19:54.206752 4841 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f9bcc5c9-d525-4d1c-b74c-c6301b17967f/volumes/kubernetes.io~configmap/ovsdbserver-nb Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.206778 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f9bcc5c9-d525-4d1c-b74c-c6301b17967f" (UID: "f9bcc5c9-d525-4d1c-b74c-c6301b17967f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.206829 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-nb\") pod \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\" (UID: \"f9bcc5c9-d525-4d1c-b74c-c6301b17967f\") " Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.207798 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.207858 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.207869 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.207878 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4jh2\" (UniqueName: \"kubernetes.io/projected/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-kube-api-access-s4jh2\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.207886 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.218127 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-svc" (OuterVolumeSpecName: 
"dns-svc") pod "f9bcc5c9-d525-4d1c-b74c-c6301b17967f" (UID: "f9bcc5c9-d525-4d1c-b74c-c6301b17967f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.309403 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f9bcc5c9-d525-4d1c-b74c-c6301b17967f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.360615 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-5jctc"] Dec 03 17:19:54 crc kubenswrapper[4841]: I1203 17:19:54.366677 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-5jctc"] Dec 03 17:19:55 crc kubenswrapper[4841]: I1203 17:19:55.054315 4841 generic.go:334] "Generic (PLEG): container finished" podID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerID="799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993" exitCode=0 Dec 03 17:19:55 crc kubenswrapper[4841]: I1203 17:19:55.054388 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"36bb6744-f5a9-43ec-98a8-762b5d2cb88d","Type":"ContainerDied","Data":"799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993"} Dec 03 17:19:55 crc kubenswrapper[4841]: I1203 17:19:55.150772 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:55 crc kubenswrapper[4841]: I1203 17:19:55.268988 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-57c8cf4866-6qqks" Dec 03 17:19:55 crc kubenswrapper[4841]: I1203 17:19:55.346151 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7cc745dffb-pksm6"] Dec 03 17:19:55 crc kubenswrapper[4841]: I1203 17:19:55.351612 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-api-7cc745dffb-pksm6" podUID="28e39677-20d4-410e-80d6-8321877f674d" containerName="barbican-api-log" containerID="cri-o://888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc" gracePeriod=30 Dec 03 17:19:55 crc kubenswrapper[4841]: I1203 17:19:55.352292 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-7cc745dffb-pksm6" podUID="28e39677-20d4-410e-80d6-8321877f674d" containerName="barbican-api" containerID="cri-o://163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf" gracePeriod=30 Dec 03 17:19:56 crc kubenswrapper[4841]: I1203 17:19:56.071814 4841 generic.go:334] "Generic (PLEG): container finished" podID="28e39677-20d4-410e-80d6-8321877f674d" containerID="888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc" exitCode=143 Dec 03 17:19:56 crc kubenswrapper[4841]: I1203 17:19:56.072978 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc745dffb-pksm6" event={"ID":"28e39677-20d4-410e-80d6-8321877f674d","Type":"ContainerDied","Data":"888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc"} Dec 03 17:19:56 crc kubenswrapper[4841]: I1203 17:19:56.253094 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9bcc5c9-d525-4d1c-b74c-c6301b17967f" path="/var/lib/kubelet/pods/f9bcc5c9-d525-4d1c-b74c-c6301b17967f/volumes" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.057864 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.111402 4841 generic.go:334] "Generic (PLEG): container finished" podID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerID="51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814" exitCode=0 Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.111455 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"36bb6744-f5a9-43ec-98a8-762b5d2cb88d","Type":"ContainerDied","Data":"51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814"} Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.111494 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"36bb6744-f5a9-43ec-98a8-762b5d2cb88d","Type":"ContainerDied","Data":"ad9f7f04fb9a63bb89cd07fcaf44d7f61e751c9375be5efcb70e68903849a581"} Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.111519 4841 scope.go:117] "RemoveContainer" containerID="799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.111697 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.138213 4841 scope.go:117] "RemoveContainer" containerID="51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.157408 4841 scope.go:117] "RemoveContainer" containerID="799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993" Dec 03 17:19:57 crc kubenswrapper[4841]: E1203 17:19:57.157816 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993\": container with ID starting with 799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993 not found: ID does not exist" containerID="799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.157851 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993"} err="failed to get container status \"799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993\": rpc error: code = NotFound desc = could not find container \"799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993\": container with ID starting with 799b481b0223808a588eac603ad3660a3a77e8afafd740527ec6a400b8ec5993 not found: ID does not exist" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.157889 4841 scope.go:117] "RemoveContainer" containerID="51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814" Dec 03 17:19:57 crc kubenswrapper[4841]: E1203 17:19:57.158463 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814\": container with ID starting with 
51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814 not found: ID does not exist" containerID="51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.158584 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814"} err="failed to get container status \"51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814\": rpc error: code = NotFound desc = could not find container \"51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814\": container with ID starting with 51ab9df8f3a6e8a65df8df56cee5cd808f6b020872534f74e3f1a57ec779f814 not found: ID does not exist" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.176414 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-scripts\") pod \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.176582 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-etc-machine-id\") pod \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.176722 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data-custom\") pod \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.176772 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2fzkv\" 
(UniqueName: \"kubernetes.io/projected/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-kube-api-access-2fzkv\") pod \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.176897 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data\") pod \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.176965 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-combined-ca-bundle\") pod \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\" (UID: \"36bb6744-f5a9-43ec-98a8-762b5d2cb88d\") " Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.178004 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "36bb6744-f5a9-43ec-98a8-762b5d2cb88d" (UID: "36bb6744-f5a9-43ec-98a8-762b5d2cb88d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.183872 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-kube-api-access-2fzkv" (OuterVolumeSpecName: "kube-api-access-2fzkv") pod "36bb6744-f5a9-43ec-98a8-762b5d2cb88d" (UID: "36bb6744-f5a9-43ec-98a8-762b5d2cb88d"). InnerVolumeSpecName "kube-api-access-2fzkv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.183892 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-scripts" (OuterVolumeSpecName: "scripts") pod "36bb6744-f5a9-43ec-98a8-762b5d2cb88d" (UID: "36bb6744-f5a9-43ec-98a8-762b5d2cb88d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.184608 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "36bb6744-f5a9-43ec-98a8-762b5d2cb88d" (UID: "36bb6744-f5a9-43ec-98a8-762b5d2cb88d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.235418 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "36bb6744-f5a9-43ec-98a8-762b5d2cb88d" (UID: "36bb6744-f5a9-43ec-98a8-762b5d2cb88d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.269465 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data" (OuterVolumeSpecName: "config-data") pod "36bb6744-f5a9-43ec-98a8-762b5d2cb88d" (UID: "36bb6744-f5a9-43ec-98a8-762b5d2cb88d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.279410 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.279456 4841 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.279475 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.279498 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2fzkv\" (UniqueName: \"kubernetes.io/projected/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-kube-api-access-2fzkv\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.279510 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.279521 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36bb6744-f5a9-43ec-98a8-762b5d2cb88d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.453573 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.481893 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 17:19:57 crc 
kubenswrapper[4841]: I1203 17:19:57.492310 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 17:19:57 crc kubenswrapper[4841]: E1203 17:19:57.492741 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerName="cinder-scheduler" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.492758 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerName="cinder-scheduler" Dec 03 17:19:57 crc kubenswrapper[4841]: E1203 17:19:57.492775 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bcc5c9-d525-4d1c-b74c-c6301b17967f" containerName="dnsmasq-dns" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.492782 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bcc5c9-d525-4d1c-b74c-c6301b17967f" containerName="dnsmasq-dns" Dec 03 17:19:57 crc kubenswrapper[4841]: E1203 17:19:57.492810 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerName="probe" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.492816 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerName="probe" Dec 03 17:19:57 crc kubenswrapper[4841]: E1203 17:19:57.492827 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9bcc5c9-d525-4d1c-b74c-c6301b17967f" containerName="init" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.492833 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9bcc5c9-d525-4d1c-b74c-c6301b17967f" containerName="init" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.493017 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9bcc5c9-d525-4d1c-b74c-c6301b17967f" containerName="dnsmasq-dns" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.493034 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerName="probe" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.493047 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" containerName="cinder-scheduler" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.493934 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.502465 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.518489 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.583880 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.584053 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.584175 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.584313 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnf9p\" (UniqueName: \"kubernetes.io/projected/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-kube-api-access-gnf9p\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.584368 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.584384 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.686409 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.686479 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.686564 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gnf9p\" (UniqueName: \"kubernetes.io/projected/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-kube-api-access-gnf9p\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.686619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.686637 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.686670 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.686690 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.691846 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.692446 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.694558 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-scripts\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.704534 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-config-data\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.708209 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnf9p\" (UniqueName: \"kubernetes.io/projected/7b91ca91-fe2e-4f87-948f-f4db1f3a5854-kube-api-access-gnf9p\") pod \"cinder-scheduler-0\" (UID: \"7b91ca91-fe2e-4f87-948f-f4db1f3a5854\") " pod="openstack/cinder-scheduler-0" Dec 03 17:19:57 crc kubenswrapper[4841]: I1203 17:19:57.820745 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.120522 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 17:19:58 crc kubenswrapper[4841]: W1203 17:19:58.133964 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b91ca91_fe2e_4f87_948f_f4db1f3a5854.slice/crio-9e42ca15ec1c114af1a1d8c729fdea968c75b8916f79c7d120bfabaf7d1393ba WatchSource:0}: Error finding container 9e42ca15ec1c114af1a1d8c729fdea968c75b8916f79c7d120bfabaf7d1393ba: Status 404 returned error can't find the container with id 9e42ca15ec1c114af1a1d8c729fdea968c75b8916f79c7d120bfabaf7d1393ba Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.270757 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36bb6744-f5a9-43ec-98a8-762b5d2cb88d" path="/var/lib/kubelet/pods/36bb6744-f5a9-43ec-98a8-762b5d2cb88d/volumes" Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.523253 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cc745dffb-pksm6" podUID="28e39677-20d4-410e-80d6-8321877f674d" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:37462->10.217.0.157:9311: read: connection reset by peer" Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.523313 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-7cc745dffb-pksm6" podUID="28e39677-20d4-410e-80d6-8321877f674d" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.157:9311/healthcheck\": read tcp 10.217.0.2:37460->10.217.0.157:9311: read: connection reset by peer" Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.833367 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.916875 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hrs2g\" (UniqueName: \"kubernetes.io/projected/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-kube-api-access-hrs2g\") pod \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.917220 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-config\") pod \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.917661 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-combined-ca-bundle\") pod \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.917750 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-httpd-config\") pod \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.917781 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-ovndb-tls-certs\") pod \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\" (UID: \"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9\") " Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.925117 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-kube-api-access-hrs2g" (OuterVolumeSpecName: "kube-api-access-hrs2g") pod "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" (UID: "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9"). InnerVolumeSpecName "kube-api-access-hrs2g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.925232 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" (UID: "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.968581 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.971354 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" (UID: "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:58 crc kubenswrapper[4841]: I1203 17:19:58.971630 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-config" (OuterVolumeSpecName: "config") pod "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" (UID: "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.005614 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" (UID: "7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.020012 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.020055 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.020072 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.020085 4841 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.020096 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hrs2g\" (UniqueName: \"kubernetes.io/projected/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9-kube-api-access-hrs2g\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.121737 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/28e39677-20d4-410e-80d6-8321877f674d-logs\") pod \"28e39677-20d4-410e-80d6-8321877f674d\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.121956 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-combined-ca-bundle\") pod \"28e39677-20d4-410e-80d6-8321877f674d\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.122062 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data-custom\") pod \"28e39677-20d4-410e-80d6-8321877f674d\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.122250 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data\") pod \"28e39677-20d4-410e-80d6-8321877f674d\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.122303 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj6ql\" (UniqueName: \"kubernetes.io/projected/28e39677-20d4-410e-80d6-8321877f674d-kube-api-access-qj6ql\") pod \"28e39677-20d4-410e-80d6-8321877f674d\" (UID: \"28e39677-20d4-410e-80d6-8321877f674d\") " Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.122321 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28e39677-20d4-410e-80d6-8321877f674d-logs" (OuterVolumeSpecName: "logs") pod "28e39677-20d4-410e-80d6-8321877f674d" (UID: "28e39677-20d4-410e-80d6-8321877f674d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.123245 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/28e39677-20d4-410e-80d6-8321877f674d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.125052 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "28e39677-20d4-410e-80d6-8321877f674d" (UID: "28e39677-20d4-410e-80d6-8321877f674d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.125542 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e39677-20d4-410e-80d6-8321877f674d-kube-api-access-qj6ql" (OuterVolumeSpecName: "kube-api-access-qj6ql") pod "28e39677-20d4-410e-80d6-8321877f674d" (UID: "28e39677-20d4-410e-80d6-8321877f674d"). InnerVolumeSpecName "kube-api-access-qj6ql". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.138817 4841 generic.go:334] "Generic (PLEG): container finished" podID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerID="4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e" exitCode=0 Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.138876 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-774586549b-jbmpq" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.139032 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774586549b-jbmpq" event={"ID":"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9","Type":"ContainerDied","Data":"4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e"} Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.139100 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-774586549b-jbmpq" event={"ID":"7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9","Type":"ContainerDied","Data":"4274cdcce8c8c6ccb7641f9847dc315d12748a223a2bee7ae85f2442060208f5"} Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.139118 4841 scope.go:117] "RemoveContainer" containerID="6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.145841 4841 generic.go:334] "Generic (PLEG): container finished" podID="28e39677-20d4-410e-80d6-8321877f674d" containerID="163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf" exitCode=0 Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.145921 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7cc745dffb-pksm6" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.145931 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc745dffb-pksm6" event={"ID":"28e39677-20d4-410e-80d6-8321877f674d","Type":"ContainerDied","Data":"163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf"} Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.145981 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7cc745dffb-pksm6" event={"ID":"28e39677-20d4-410e-80d6-8321877f674d","Type":"ContainerDied","Data":"6753657589eb205e8273e2833897b3b31f5ba6989801d0014e56e3387001a442"} Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.149019 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b91ca91-fe2e-4f87-948f-f4db1f3a5854","Type":"ContainerStarted","Data":"890e62fb176b284966cb214112b0b6b1c1b4db79c5fd6b2b14e725c66c659014"} Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.149068 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7b91ca91-fe2e-4f87-948f-f4db1f3a5854","Type":"ContainerStarted","Data":"9e42ca15ec1c114af1a1d8c729fdea968c75b8916f79c7d120bfabaf7d1393ba"} Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.154700 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "28e39677-20d4-410e-80d6-8321877f674d" (UID: "28e39677-20d4-410e-80d6-8321877f674d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.193022 4841 scope.go:117] "RemoveContainer" containerID="4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.194559 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data" (OuterVolumeSpecName: "config-data") pod "28e39677-20d4-410e-80d6-8321877f674d" (UID: "28e39677-20d4-410e-80d6-8321877f674d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.194466 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-774586549b-jbmpq"] Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.204151 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-774586549b-jbmpq"] Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.217358 4841 scope.go:117] "RemoveContainer" containerID="6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d" Dec 03 17:19:59 crc kubenswrapper[4841]: E1203 17:19:59.217717 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d\": container with ID starting with 6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d not found: ID does not exist" containerID="6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.217763 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d"} err="failed to get container status \"6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d\": rpc error: code = NotFound desc = could 
not find container \"6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d\": container with ID starting with 6d4e1c7007e85ff62c9d1f50a5d37d8b7dfc50d77dd4acb097d698ee4da4703d not found: ID does not exist" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.217787 4841 scope.go:117] "RemoveContainer" containerID="4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e" Dec 03 17:19:59 crc kubenswrapper[4841]: E1203 17:19:59.218081 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e\": container with ID starting with 4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e not found: ID does not exist" containerID="4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.218109 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e"} err="failed to get container status \"4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e\": rpc error: code = NotFound desc = could not find container \"4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e\": container with ID starting with 4f14a4b3552e8378a090c69c07882b3bdb78bf2051476e8378fa58e88dcfaa7e not found: ID does not exist" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.218128 4841 scope.go:117] "RemoveContainer" containerID="163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.225059 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.225080 4841 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-qj6ql\" (UniqueName: \"kubernetes.io/projected/28e39677-20d4-410e-80d6-8321877f674d-kube-api-access-qj6ql\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.225090 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.225098 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/28e39677-20d4-410e-80d6-8321877f674d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.239391 4841 scope.go:117] "RemoveContainer" containerID="888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.257930 4841 scope.go:117] "RemoveContainer" containerID="163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf" Dec 03 17:19:59 crc kubenswrapper[4841]: E1203 17:19:59.259864 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf\": container with ID starting with 163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf not found: ID does not exist" containerID="163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.259896 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf"} err="failed to get container status \"163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf\": rpc error: code = NotFound desc = could not find container 
\"163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf\": container with ID starting with 163f8604f814590d733bccb0afb6d8848dda99a4d4a9a8fdc483e9186061c7bf not found: ID does not exist" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.259933 4841 scope.go:117] "RemoveContainer" containerID="888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc" Dec 03 17:19:59 crc kubenswrapper[4841]: E1203 17:19:59.260129 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc\": container with ID starting with 888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc not found: ID does not exist" containerID="888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.260151 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc"} err="failed to get container status \"888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc\": rpc error: code = NotFound desc = could not find container \"888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc\": container with ID starting with 888f42d6c493a7033faf3990f8e56f2017968ed3d0bf57eecb6d1b3a5c6706bc not found: ID does not exist" Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.491342 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-7cc745dffb-pksm6"] Dec 03 17:19:59 crc kubenswrapper[4841]: I1203 17:19:59.501871 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-7cc745dffb-pksm6"] Dec 03 17:20:00 crc kubenswrapper[4841]: I1203 17:20:00.161197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"7b91ca91-fe2e-4f87-948f-f4db1f3a5854","Type":"ContainerStarted","Data":"feaba5d9ae67e4a9bb9f316e8c214861c2d88bfb500495d97c76954bbc6c5d7b"} Dec 03 17:20:00 crc kubenswrapper[4841]: I1203 17:20:00.183512 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.183491562 podStartE2EDuration="3.183491562s" podCreationTimestamp="2025-12-03 17:19:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:00.183458911 +0000 UTC m=+1194.570979668" watchObservedRunningTime="2025-12-03 17:20:00.183491562 +0000 UTC m=+1194.571012299" Dec 03 17:20:00 crc kubenswrapper[4841]: I1203 17:20:00.251428 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e39677-20d4-410e-80d6-8321877f674d" path="/var/lib/kubelet/pods/28e39677-20d4-410e-80d6-8321877f674d/volumes" Dec 03 17:20:00 crc kubenswrapper[4841]: I1203 17:20:00.252442 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" path="/var/lib/kubelet/pods/7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9/volumes" Dec 03 17:20:01 crc kubenswrapper[4841]: I1203 17:20:01.218551 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59984668d4-h88x4" Dec 03 17:20:01 crc kubenswrapper[4841]: I1203 17:20:01.220424 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-59984668d4-h88x4" Dec 03 17:20:01 crc kubenswrapper[4841]: I1203 17:20:01.271379 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 17:20:01 crc kubenswrapper[4841]: I1203 17:20:01.817244 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-59cdc4489f-kzmfh" Dec 03 17:20:02 crc kubenswrapper[4841]: I1203 17:20:02.821764 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.686601 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 17:20:03 crc kubenswrapper[4841]: E1203 17:20:03.687583 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e39677-20d4-410e-80d6-8321877f674d" containerName="barbican-api-log" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.687613 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e39677-20d4-410e-80d6-8321877f674d" containerName="barbican-api-log" Dec 03 17:20:03 crc kubenswrapper[4841]: E1203 17:20:03.687668 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerName="neutron-api" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.687687 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerName="neutron-api" Dec 03 17:20:03 crc kubenswrapper[4841]: E1203 17:20:03.687730 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e39677-20d4-410e-80d6-8321877f674d" containerName="barbican-api" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.687750 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e39677-20d4-410e-80d6-8321877f674d" containerName="barbican-api" Dec 03 17:20:03 crc kubenswrapper[4841]: E1203 17:20:03.687787 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerName="neutron-httpd" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.687803 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerName="neutron-httpd" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.688231 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e39677-20d4-410e-80d6-8321877f674d" 
containerName="barbican-api" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.688265 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e39677-20d4-410e-80d6-8321877f674d" containerName="barbican-api-log" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.688292 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerName="neutron-httpd" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.688323 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7751cb9c-f0c8-4db4-b53f-5ac0ae8f16a9" containerName="neutron-api" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.689287 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.694995 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.696311 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.696453 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.697988 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-48755" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.820827 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config-secret\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.820871 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.820930 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.821205 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgng\" (UniqueName: \"kubernetes.io/projected/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-kube-api-access-vkgng\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.923596 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config-secret\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.924636 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.924827 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.925028 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgng\" (UniqueName: \"kubernetes.io/projected/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-kube-api-access-vkgng\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.925536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.933391 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-combined-ca-bundle\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.935663 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config-secret\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " pod="openstack/openstackclient" Dec 03 17:20:03 crc kubenswrapper[4841]: I1203 17:20:03.955014 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgng\" (UniqueName: \"kubernetes.io/projected/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-kube-api-access-vkgng\") pod \"openstackclient\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " 
pod="openstack/openstackclient" Dec 03 17:20:04 crc kubenswrapper[4841]: I1203 17:20:04.009653 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 17:20:04 crc kubenswrapper[4841]: I1203 17:20:04.329468 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 17:20:04 crc kubenswrapper[4841]: W1203 17:20:04.349503 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod967e0e4b_7b01_435c_92b7_dedc9b63dc5c.slice/crio-5f1857f3250b0f90ffca5e1417d189da0790d76c7c3dc8de9cf233e47bb511c4 WatchSource:0}: Error finding container 5f1857f3250b0f90ffca5e1417d189da0790d76c7c3dc8de9cf233e47bb511c4: Status 404 returned error can't find the container with id 5f1857f3250b0f90ffca5e1417d189da0790d76c7c3dc8de9cf233e47bb511c4 Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.211150 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"967e0e4b-7b01-435c-92b7-dedc9b63dc5c","Type":"ContainerStarted","Data":"5f1857f3250b0f90ffca5e1417d189da0790d76c7c3dc8de9cf233e47bb511c4"} Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.726189 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-7848f458c5-zfrpj"] Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.728927 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.732126 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.732492 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-l5pst" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.732665 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.736003 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7848f458c5-zfrpj"] Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.860548 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data-custom\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.860843 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkvb6\" (UniqueName: \"kubernetes.io/projected/702ea708-3829-4dcb-9cee-8db1f6fbb715-kube-api-access-kkvb6\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.860875 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc 
kubenswrapper[4841]: I1203 17:20:05.860920 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-combined-ca-bundle\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.866466 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-dhr82"] Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.867889 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.885938 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-dhr82"] Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.911969 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-58569788fd-b9rl2"] Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.913402 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.918285 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.952110 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58569788fd-b9rl2"] Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.963772 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkvb6\" (UniqueName: \"kubernetes.io/projected/702ea708-3829-4dcb-9cee-8db1f6fbb715-kube-api-access-kkvb6\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.963817 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data-custom\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.963853 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.963887 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 
17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.963920 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-combined-ca-bundle\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.963941 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data-custom\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.963961 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkd9c\" (UniqueName: \"kubernetes.io/projected/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-kube-api-access-jkd9c\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.963987 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.964037 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " 
pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.964061 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.964088 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.964110 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4jnb\" (UniqueName: \"kubernetes.io/projected/575f396c-3448-4351-867c-54ac4b0b211f-kube-api-access-b4jnb\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.964130 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-config\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.964146 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-combined-ca-bundle\") pod 
\"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.975184 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-f96469445-n4lln"] Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.977288 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data-custom\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.977360 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.977408 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.981173 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.982591 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkvb6\" (UniqueName: \"kubernetes.io/projected/702ea708-3829-4dcb-9cee-8db1f6fbb715-kube-api-access-kkvb6\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:05 crc kubenswrapper[4841]: I1203 17:20:05.994806 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-combined-ca-bundle\") pod \"heat-engine-7848f458c5-zfrpj\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") " pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.006112 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-f96469445-n4lln"] Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066320 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066376 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066410 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066441 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4jnb\" (UniqueName: \"kubernetes.io/projected/575f396c-3448-4351-867c-54ac4b0b211f-kube-api-access-b4jnb\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: 
I1203 17:20:06.066463 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-config\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066485 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-combined-ca-bundle\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066501 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data-custom\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066540 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-combined-ca-bundle\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066741 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5dnb\" (UniqueName: \"kubernetes.io/projected/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-kube-api-access-d5dnb\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 
17:20:06.066856 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066918 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data-custom\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.066954 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkd9c\" (UniqueName: \"kubernetes.io/projected/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-kube-api-access-jkd9c\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.067010 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.067155 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.067291 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-sb\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.067309 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-config\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.070124 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-svc\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.070290 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-nb\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.070337 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-swift-storage-0\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.070524 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-combined-ca-bundle\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.070821 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.073630 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data-custom\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.080471 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.090633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkd9c\" (UniqueName: \"kubernetes.io/projected/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-kube-api-access-jkd9c\") pod \"heat-cfnapi-58569788fd-b9rl2\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.091796 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4jnb\" (UniqueName: \"kubernetes.io/projected/575f396c-3448-4351-867c-54ac4b0b211f-kube-api-access-b4jnb\") pod \"dnsmasq-dns-7756b9d78c-dhr82\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.168317 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-combined-ca-bundle\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.168401 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5dnb\" (UniqueName: \"kubernetes.io/projected/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-kube-api-access-d5dnb\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.168501 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " 
pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.168546 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data-custom\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.174293 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-combined-ca-bundle\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.176123 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data-custom\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.179014 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.187709 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.189556 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5dnb\" (UniqueName: \"kubernetes.io/projected/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-kube-api-access-d5dnb\") pod \"heat-api-f96469445-n4lln\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.249996 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.351950 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.609364 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-7848f458c5-zfrpj"] Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.756713 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-dhr82"] Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.841274 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-f96469445-n4lln"] Dec 03 17:20:06 crc kubenswrapper[4841]: I1203 17:20:06.908089 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-58569788fd-b9rl2"] Dec 03 17:20:06 crc kubenswrapper[4841]: W1203 17:20:06.930614 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcdb5b46f_ef21_412d_a1de_f8bf0dd8ef12.slice/crio-2618837582b42dad8b4e8e3c6c4aad007beb679230e036900fb7e6dbc452dbcc WatchSource:0}: Error finding container 2618837582b42dad8b4e8e3c6c4aad007beb679230e036900fb7e6dbc452dbcc: Status 404 returned error can't find the container with id 
2618837582b42dad8b4e8e3c6c4aad007beb679230e036900fb7e6dbc452dbcc Dec 03 17:20:07 crc kubenswrapper[4841]: I1203 17:20:07.277389 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f96469445-n4lln" event={"ID":"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a","Type":"ContainerStarted","Data":"8f17a10e681e59970b0801f9a1bd765af548d02d0c0d0261f810da992b0634ce"} Dec 03 17:20:07 crc kubenswrapper[4841]: I1203 17:20:07.280517 4841 generic.go:334] "Generic (PLEG): container finished" podID="575f396c-3448-4351-867c-54ac4b0b211f" containerID="8ace8f01fa94626762f0165ad54fdf1464f92dc65b9896b257c24a33e9181469" exitCode=0 Dec 03 17:20:07 crc kubenswrapper[4841]: I1203 17:20:07.280578 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" event={"ID":"575f396c-3448-4351-867c-54ac4b0b211f","Type":"ContainerDied","Data":"8ace8f01fa94626762f0165ad54fdf1464f92dc65b9896b257c24a33e9181469"} Dec 03 17:20:07 crc kubenswrapper[4841]: I1203 17:20:07.280595 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" event={"ID":"575f396c-3448-4351-867c-54ac4b0b211f","Type":"ContainerStarted","Data":"ae970e292ba5e9e7f902921d0de8717ffa3233b24373a41f8d0223f514ed1e92"} Dec 03 17:20:07 crc kubenswrapper[4841]: I1203 17:20:07.295152 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58569788fd-b9rl2" event={"ID":"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12","Type":"ContainerStarted","Data":"2618837582b42dad8b4e8e3c6c4aad007beb679230e036900fb7e6dbc452dbcc"} Dec 03 17:20:07 crc kubenswrapper[4841]: I1203 17:20:07.302947 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7848f458c5-zfrpj" event={"ID":"702ea708-3829-4dcb-9cee-8db1f6fbb715","Type":"ContainerStarted","Data":"55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4"} Dec 03 17:20:07 crc kubenswrapper[4841]: I1203 17:20:07.303000 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/heat-engine-7848f458c5-zfrpj" event={"ID":"702ea708-3829-4dcb-9cee-8db1f6fbb715","Type":"ContainerStarted","Data":"3d38c95eea83c06968d8eeac0aceede794f38fa4b127d0d3621a6a3cd1faa11c"} Dec 03 17:20:07 crc kubenswrapper[4841]: I1203 17:20:07.303222 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:07 crc kubenswrapper[4841]: I1203 17:20:07.335364 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-7848f458c5-zfrpj" podStartSLOduration=2.335339838 podStartE2EDuration="2.335339838s" podCreationTimestamp="2025-12-03 17:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:07.327249869 +0000 UTC m=+1201.714770596" watchObservedRunningTime="2025-12-03 17:20:07.335339838 +0000 UTC m=+1201.722860565" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.042308 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.227370 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6df7dcffd7-hvqxv"] Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.229320 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.231939 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.232321 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.233294 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.234538 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6df7dcffd7-hvqxv"] Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.319734 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efd46ec-9481-40f8-be85-637ddafc2291-run-httpd\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.319830 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-config-data\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.319943 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-public-tls-certs\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 
17:20:08.320604 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-combined-ca-bundle\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.320643 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-internal-tls-certs\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.320658 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vn7wm\" (UniqueName: \"kubernetes.io/projected/8efd46ec-9481-40f8-be85-637ddafc2291-kube-api-access-vn7wm\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.320697 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efd46ec-9481-40f8-be85-637ddafc2291-log-httpd\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.320755 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8efd46ec-9481-40f8-be85-637ddafc2291-etc-swift\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" 
Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.347273 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" event={"ID":"575f396c-3448-4351-867c-54ac4b0b211f","Type":"ContainerStarted","Data":"62bdce21b8c15e4288d57a63b707dae053dd42cf9c1507221681f5355b57b67d"} Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.347328 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.371473 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" podStartSLOduration=3.371457278 podStartE2EDuration="3.371457278s" podCreationTimestamp="2025-12-03 17:20:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:08.368953259 +0000 UTC m=+1202.756473986" watchObservedRunningTime="2025-12-03 17:20:08.371457278 +0000 UTC m=+1202.758977995" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.423199 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-config-data\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.423240 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-public-tls-certs\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.423396 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-combined-ca-bundle\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.423418 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-internal-tls-certs\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.423433 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vn7wm\" (UniqueName: \"kubernetes.io/projected/8efd46ec-9481-40f8-be85-637ddafc2291-kube-api-access-vn7wm\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.423470 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efd46ec-9481-40f8-be85-637ddafc2291-log-httpd\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.423504 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8efd46ec-9481-40f8-be85-637ddafc2291-etc-swift\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.423521 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8efd46ec-9481-40f8-be85-637ddafc2291-run-httpd\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.423942 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efd46ec-9481-40f8-be85-637ddafc2291-run-httpd\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.427560 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8efd46ec-9481-40f8-be85-637ddafc2291-log-httpd\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.438385 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-combined-ca-bundle\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.438708 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-internal-tls-certs\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.444080 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-public-tls-certs\") pod 
\"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.445519 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8efd46ec-9481-40f8-be85-637ddafc2291-etc-swift\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.447584 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8efd46ec-9481-40f8-be85-637ddafc2291-config-data\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.448171 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vn7wm\" (UniqueName: \"kubernetes.io/projected/8efd46ec-9481-40f8-be85-637ddafc2291-kube-api-access-vn7wm\") pod \"swift-proxy-6df7dcffd7-hvqxv\" (UID: \"8efd46ec-9481-40f8-be85-637ddafc2291\") " pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:08 crc kubenswrapper[4841]: I1203 17:20:08.545081 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.317149 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.317434 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.317483 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.318194 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"329f3ec52dfd2e9fae18e4f92bbdfa693dee71eed4da1af39ebdcda2381dc16d"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.318244 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://329f3ec52dfd2e9fae18e4f92bbdfa693dee71eed4da1af39ebdcda2381dc16d" gracePeriod=600 Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.365771 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-cfnapi-58569788fd-b9rl2" event={"ID":"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12","Type":"ContainerStarted","Data":"11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72"} Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.367553 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.373037 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f96469445-n4lln" event={"ID":"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a","Type":"ContainerStarted","Data":"64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a"} Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.373105 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.399327 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-58569788fd-b9rl2" podStartSLOduration=2.264670678 podStartE2EDuration="4.399087478s" podCreationTimestamp="2025-12-03 17:20:05 +0000 UTC" firstStartedPulling="2025-12-03 17:20:06.933121608 +0000 UTC m=+1201.320642335" lastFinishedPulling="2025-12-03 17:20:09.067538418 +0000 UTC m=+1203.455059135" observedRunningTime="2025-12-03 17:20:09.38461238 +0000 UTC m=+1203.772133097" watchObservedRunningTime="2025-12-03 17:20:09.399087478 +0000 UTC m=+1203.786608205" Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.432027 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-f96469445-n4lln" podStartSLOduration=2.237171225 podStartE2EDuration="4.432009476s" podCreationTimestamp="2025-12-03 17:20:05 +0000 UTC" firstStartedPulling="2025-12-03 17:20:06.86339835 +0000 UTC m=+1201.250919067" lastFinishedPulling="2025-12-03 17:20:09.058236591 +0000 UTC m=+1203.445757318" observedRunningTime="2025-12-03 17:20:09.419185697 +0000 UTC 
m=+1203.806706424" watchObservedRunningTime="2025-12-03 17:20:09.432009476 +0000 UTC m=+1203.819530203" Dec 03 17:20:09 crc kubenswrapper[4841]: I1203 17:20:09.615648 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6df7dcffd7-hvqxv"] Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.395184 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" event={"ID":"8efd46ec-9481-40f8-be85-637ddafc2291","Type":"ContainerStarted","Data":"4a30785e80af3a8919fec15d64d14bdd045dbe24db461d4280686070ec781d59"} Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.395715 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" event={"ID":"8efd46ec-9481-40f8-be85-637ddafc2291","Type":"ContainerStarted","Data":"c6248b73d0e372afddd7b6843144d1ff8ae3866e3629e5b04259dc3ddc9b9707"} Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.395730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" event={"ID":"8efd46ec-9481-40f8-be85-637ddafc2291","Type":"ContainerStarted","Data":"39cb390b0c0f85e25eb16f953e7412a6ab2fd095b49ac99cd124c70f25184f10"} Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.395761 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.395772 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.398385 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="329f3ec52dfd2e9fae18e4f92bbdfa693dee71eed4da1af39ebdcda2381dc16d" exitCode=0 Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.399529 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"329f3ec52dfd2e9fae18e4f92bbdfa693dee71eed4da1af39ebdcda2381dc16d"} Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.399557 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"ba0a1c8798769f6bd460530e7ac2c690f029124a02f84950fbf6164264b3c8a5"} Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.399574 4841 scope.go:117] "RemoveContainer" containerID="b3e3ec18aa928c5194a578236f76747e824d216c75a2a957951a1e3726f7b86a" Dec 03 17:20:10 crc kubenswrapper[4841]: I1203 17:20:10.445954 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" podStartSLOduration=2.445938288 podStartE2EDuration="2.445938288s" podCreationTimestamp="2025-12-03 17:20:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:10.420213507 +0000 UTC m=+1204.807734234" watchObservedRunningTime="2025-12-03 17:20:10.445938288 +0000 UTC m=+1204.833459015" Dec 03 17:20:11 crc kubenswrapper[4841]: I1203 17:20:11.743035 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:11 crc kubenswrapper[4841]: I1203 17:20:11.743700 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="ceilometer-central-agent" containerID="cri-o://a411dfe32aa66ef5ebc2bc6688c83ea76a3226d765da217a8342bc9ca163bc86" gracePeriod=30 Dec 03 17:20:11 crc kubenswrapper[4841]: I1203 17:20:11.743822 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="proxy-httpd" containerID="cri-o://486365a09e373d1ef8a8361fd6355529f52a5a366911d17b2c9deed60994958b" gracePeriod=30 Dec 03 17:20:11 crc kubenswrapper[4841]: I1203 17:20:11.743864 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="sg-core" containerID="cri-o://8572b36aa5f483884060c16bc9e294bb2caa6c20893fea4c9e2e9252200236cb" gracePeriod=30 Dec 03 17:20:11 crc kubenswrapper[4841]: I1203 17:20:11.743895 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="ceilometer-notification-agent" containerID="cri-o://eb4a8452ed0f0f33acb38d03f13f2051ccb657dabc84d31ca6a34e54df5ff2cc" gracePeriod=30 Dec 03 17:20:11 crc kubenswrapper[4841]: I1203 17:20:11.751022 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": read tcp 10.217.0.2:53704->10.217.0.162:3000: read: connection reset by peer" Dec 03 17:20:12 crc kubenswrapper[4841]: I1203 17:20:12.435870 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4fee296-7753-4460-82f5-45df436f475d" containerID="486365a09e373d1ef8a8361fd6355529f52a5a366911d17b2c9deed60994958b" exitCode=0 Dec 03 17:20:12 crc kubenswrapper[4841]: I1203 17:20:12.435925 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4fee296-7753-4460-82f5-45df436f475d" containerID="8572b36aa5f483884060c16bc9e294bb2caa6c20893fea4c9e2e9252200236cb" exitCode=2 Dec 03 17:20:12 crc kubenswrapper[4841]: I1203 17:20:12.435937 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4fee296-7753-4460-82f5-45df436f475d" containerID="a411dfe32aa66ef5ebc2bc6688c83ea76a3226d765da217a8342bc9ca163bc86" 
exitCode=0 Dec 03 17:20:12 crc kubenswrapper[4841]: I1203 17:20:12.435949 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerDied","Data":"486365a09e373d1ef8a8361fd6355529f52a5a366911d17b2c9deed60994958b"} Dec 03 17:20:12 crc kubenswrapper[4841]: I1203 17:20:12.435995 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerDied","Data":"8572b36aa5f483884060c16bc9e294bb2caa6c20893fea4c9e2e9252200236cb"} Dec 03 17:20:12 crc kubenswrapper[4841]: I1203 17:20:12.436005 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerDied","Data":"a411dfe32aa66ef5ebc2bc6688c83ea76a3226d765da217a8342bc9ca163bc86"} Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.013532 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-555b568f78-v86bh"] Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.016753 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.034980 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-7dd5698d74-2nhcr"] Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.036602 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.049877 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-6fd49975b6-6x566"] Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.051188 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.056231 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7dd5698d74-2nhcr"] Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.063940 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-555b568f78-v86bh"] Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.083959 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6fd49975b6-6x566"] Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.111732 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.111781 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-config-data-custom\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.111814 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.111839 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-combined-ca-bundle\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.111862 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcj5r\" (UniqueName: \"kubernetes.io/projected/0ada9b18-f961-4399-a186-a9ccb498d527-kube-api-access-bcj5r\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.111886 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-combined-ca-bundle\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.111970 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data-custom\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.111995 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-config-data\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.112014 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-gxpt7\" (UniqueName: \"kubernetes.io/projected/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-kube-api-access-gxpt7\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.112029 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpgpg\" (UniqueName: \"kubernetes.io/projected/79950490-c5c5-4db6-9121-220947907f6f-kube-api-access-tpgpg\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.112064 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data-custom\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.112087 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-combined-ca-bundle\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.212761 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.212814 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-config-data-custom\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.212844 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.212864 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-combined-ca-bundle\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.212884 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcj5r\" (UniqueName: \"kubernetes.io/projected/0ada9b18-f961-4399-a186-a9ccb498d527-kube-api-access-bcj5r\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.212920 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-combined-ca-bundle\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.212969 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data-custom\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.212987 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-config-data\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.213004 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxpt7\" (UniqueName: \"kubernetes.io/projected/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-kube-api-access-gxpt7\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.213018 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpgpg\" (UniqueName: \"kubernetes.io/projected/79950490-c5c5-4db6-9121-220947907f6f-kube-api-access-tpgpg\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.213049 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data-custom\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.213071 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-combined-ca-bundle\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.227794 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-combined-ca-bundle\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.229735 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data-custom\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.231008 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.232112 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-config-data-custom\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.233384 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-combined-ca-bundle\") 
pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.233603 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-combined-ca-bundle\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.234334 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.234599 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxpt7\" (UniqueName: \"kubernetes.io/projected/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-kube-api-access-gxpt7\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.236308 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcj5r\" (UniqueName: \"kubernetes.io/projected/0ada9b18-f961-4399-a186-a9ccb498d527-kube-api-access-bcj5r\") pod \"heat-api-7dd5698d74-2nhcr\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.241060 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data-custom\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: 
\"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.244220 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d-config-data\") pod \"heat-engine-555b568f78-v86bh\" (UID: \"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d\") " pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.244965 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpgpg\" (UniqueName: \"kubernetes.io/projected/79950490-c5c5-4db6-9121-220947907f6f-kube-api-access-tpgpg\") pod \"heat-cfnapi-6fd49975b6-6x566\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.349114 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.355088 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.375858 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.459553 4841 generic.go:334] "Generic (PLEG): container finished" podID="f4fee296-7753-4460-82f5-45df436f475d" containerID="eb4a8452ed0f0f33acb38d03f13f2051ccb657dabc84d31ca6a34e54df5ff2cc" exitCode=0 Dec 03 17:20:13 crc kubenswrapper[4841]: I1203 17:20:13.459595 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerDied","Data":"eb4a8452ed0f0f33acb38d03f13f2051ccb657dabc84d31ca6a34e54df5ff2cc"} Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.234715 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.162:3000/\": dial tcp 10.217.0.162:3000: connect: connection refused" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.348731 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58569788fd-b9rl2"] Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.348973 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-58569788fd-b9rl2" podUID="cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" containerName="heat-cfnapi" containerID="cri-o://11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72" gracePeriod=60 Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.358485 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-f96469445-n4lln"] Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.358702 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-f96469445-n4lln" podUID="0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" containerName="heat-api" containerID="cri-o://64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a" 
gracePeriod=60 Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.362387 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-58569788fd-b9rl2" podUID="cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.168:8000/healthcheck\": EOF" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.364294 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-f96469445-n4lln" podUID="0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.169:8004/healthcheck\": EOF" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.382771 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-c5f5d9bb6-ddbgn"] Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.384395 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.386098 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.386252 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.400316 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-799f468d8f-qbwl4"] Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.401885 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.404137 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.404362 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.428826 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-799f468d8f-qbwl4"] Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.449215 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-public-tls-certs\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450520 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8f8c\" (UniqueName: \"kubernetes.io/projected/839b1e72-e619-4c4c-80ac-0754251beb2a-kube-api-access-c8f8c\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450599 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-internal-tls-certs\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450632 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-public-tls-certs\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450656 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-combined-ca-bundle\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450749 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-internal-tls-certs\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450785 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-combined-ca-bundle\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450815 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwsdg\" (UniqueName: \"kubernetes.io/projected/bfc25bf9-7fd7-4f92-9dbb-61f291592975-kube-api-access-pwsdg\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450871 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-config-data\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450920 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-config-data-custom\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450969 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-config-data-custom\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.450984 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-config-data\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.483390 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-c5f5d9bb6-ddbgn"] Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552097 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-config-data\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " 
pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552419 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-config-data-custom\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552458 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-config-data-custom\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552477 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-config-data\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552523 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-public-tls-certs\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552541 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8f8c\" (UniqueName: \"kubernetes.io/projected/839b1e72-e619-4c4c-80ac-0754251beb2a-kube-api-access-c8f8c\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 
crc kubenswrapper[4841]: I1203 17:20:14.552570 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-internal-tls-certs\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552595 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-public-tls-certs\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552616 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-combined-ca-bundle\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552660 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-internal-tls-certs\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552688 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-combined-ca-bundle\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.552711 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwsdg\" (UniqueName: \"kubernetes.io/projected/bfc25bf9-7fd7-4f92-9dbb-61f291592975-kube-api-access-pwsdg\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.559945 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-config-data-custom\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.560301 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-config-data\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.560505 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-public-tls-certs\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.561268 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-config-data\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.568625 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-config-data-custom\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.576005 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-internal-tls-certs\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.576196 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-public-tls-certs\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.576259 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-internal-tls-certs\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.576316 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/839b1e72-e619-4c4c-80ac-0754251beb2a-combined-ca-bundle\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.576592 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfc25bf9-7fd7-4f92-9dbb-61f291592975-combined-ca-bundle\") pod 
\"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.579425 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwsdg\" (UniqueName: \"kubernetes.io/projected/bfc25bf9-7fd7-4f92-9dbb-61f291592975-kube-api-access-pwsdg\") pod \"heat-cfnapi-c5f5d9bb6-ddbgn\" (UID: \"bfc25bf9-7fd7-4f92-9dbb-61f291592975\") " pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.579487 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8f8c\" (UniqueName: \"kubernetes.io/projected/839b1e72-e619-4c4c-80ac-0754251beb2a-kube-api-access-c8f8c\") pod \"heat-api-799f468d8f-qbwl4\" (UID: \"839b1e72-e619-4c4c-80ac-0754251beb2a\") " pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.713530 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:14 crc kubenswrapper[4841]: I1203 17:20:14.732314 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:16 crc kubenswrapper[4841]: I1203 17:20:16.117341 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-7848f458c5-zfrpj" Dec 03 17:20:16 crc kubenswrapper[4841]: I1203 17:20:16.189048 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:20:16 crc kubenswrapper[4841]: I1203 17:20:16.260678 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6xl2c"] Dec 03 17:20:16 crc kubenswrapper[4841]: I1203 17:20:16.261224 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" podUID="d07669f7-1f8e-4e31-96a6-f3f3fcea5315" containerName="dnsmasq-dns" containerID="cri-o://943505f1a01545930727cd0e63319a51aa58e4fc1a2b5576fecfbc66e542f176" gracePeriod=10 Dec 03 17:20:16 crc kubenswrapper[4841]: I1203 17:20:16.493372 4841 generic.go:334] "Generic (PLEG): container finished" podID="d07669f7-1f8e-4e31-96a6-f3f3fcea5315" containerID="943505f1a01545930727cd0e63319a51aa58e4fc1a2b5576fecfbc66e542f176" exitCode=0 Dec 03 17:20:16 crc kubenswrapper[4841]: I1203 17:20:16.493448 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" event={"ID":"d07669f7-1f8e-4e31-96a6-f3f3fcea5315","Type":"ContainerDied","Data":"943505f1a01545930727cd0e63319a51aa58e4fc1a2b5576fecfbc66e542f176"} Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.369480 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.400104 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-config\") pod \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.400307 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-nb\") pod \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.400336 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj9sp\" (UniqueName: \"kubernetes.io/projected/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-kube-api-access-dj9sp\") pod \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.400382 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-swift-storage-0\") pod \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.400428 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-svc\") pod \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.400452 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-sb\") pod \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\" (UID: \"d07669f7-1f8e-4e31-96a6-f3f3fcea5315\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.417082 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-kube-api-access-dj9sp" (OuterVolumeSpecName: "kube-api-access-dj9sp") pod "d07669f7-1f8e-4e31-96a6-f3f3fcea5315" (UID: "d07669f7-1f8e-4e31-96a6-f3f3fcea5315"). InnerVolumeSpecName "kube-api-access-dj9sp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.465532 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d07669f7-1f8e-4e31-96a6-f3f3fcea5315" (UID: "d07669f7-1f8e-4e31-96a6-f3f3fcea5315"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.483297 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.484529 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d07669f7-1f8e-4e31-96a6-f3f3fcea5315" (UID: "d07669f7-1f8e-4e31-96a6-f3f3fcea5315"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.490718 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d07669f7-1f8e-4e31-96a6-f3f3fcea5315" (UID: "d07669f7-1f8e-4e31-96a6-f3f3fcea5315"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.506777 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-scripts\") pod \"f4fee296-7753-4460-82f5-45df436f475d\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.506875 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-run-httpd\") pod \"f4fee296-7753-4460-82f5-45df436f475d\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.506940 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-log-httpd\") pod \"f4fee296-7753-4460-82f5-45df436f475d\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.507006 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-sg-core-conf-yaml\") pod \"f4fee296-7753-4460-82f5-45df436f475d\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.507058 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-config-data\") pod \"f4fee296-7753-4460-82f5-45df436f475d\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.507087 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phr8z\" (UniqueName: \"kubernetes.io/projected/f4fee296-7753-4460-82f5-45df436f475d-kube-api-access-phr8z\") pod \"f4fee296-7753-4460-82f5-45df436f475d\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.507194 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-combined-ca-bundle\") pod \"f4fee296-7753-4460-82f5-45df436f475d\" (UID: \"f4fee296-7753-4460-82f5-45df436f475d\") " Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.507569 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.507589 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.507599 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.507610 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj9sp\" (UniqueName: 
\"kubernetes.io/projected/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-kube-api-access-dj9sp\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.508994 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-config" (OuterVolumeSpecName: "config") pod "d07669f7-1f8e-4e31-96a6-f3f3fcea5315" (UID: "d07669f7-1f8e-4e31-96a6-f3f3fcea5315"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.509423 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f4fee296-7753-4460-82f5-45df436f475d" (UID: "f4fee296-7753-4460-82f5-45df436f475d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.513525 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f4fee296-7753-4460-82f5-45df436f475d" (UID: "f4fee296-7753-4460-82f5-45df436f475d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.516896 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4fee296-7753-4460-82f5-45df436f475d-kube-api-access-phr8z" (OuterVolumeSpecName: "kube-api-access-phr8z") pod "f4fee296-7753-4460-82f5-45df436f475d" (UID: "f4fee296-7753-4460-82f5-45df436f475d"). InnerVolumeSpecName "kube-api-access-phr8z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.544406 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d07669f7-1f8e-4e31-96a6-f3f3fcea5315" (UID: "d07669f7-1f8e-4e31-96a6-f3f3fcea5315"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.551648 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" event={"ID":"d07669f7-1f8e-4e31-96a6-f3f3fcea5315","Type":"ContainerDied","Data":"a28a83f5c5f1491ff01d82e508a3d0b7b9af421e4c6f89b669eb6d6ef8e66bba"} Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.551702 4841 scope.go:117] "RemoveContainer" containerID="943505f1a01545930727cd0e63319a51aa58e4fc1a2b5576fecfbc66e542f176" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.551801 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-6xl2c" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.557049 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-scripts" (OuterVolumeSpecName: "scripts") pod "f4fee296-7753-4460-82f5-45df436f475d" (UID: "f4fee296-7753-4460-82f5-45df436f475d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.583265 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f4fee296-7753-4460-82f5-45df436f475d","Type":"ContainerDied","Data":"ac12bd895dffa53448fdef84a9d94b6cb8adb9110a74c6f790502c0418eabe53"} Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.583502 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.614829 4841 scope.go:117] "RemoveContainer" containerID="b745da054c75ee382772c823e2a56ac2b9387306d6ab5ebbe652542183d563ef" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.615503 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phr8z\" (UniqueName: \"kubernetes.io/projected/f4fee296-7753-4460-82f5-45df436f475d-kube-api-access-phr8z\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.615599 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.615666 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.615718 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.615768 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f4fee296-7753-4460-82f5-45df436f475d-log-httpd\") on node \"crc\" DevicePath 
\"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.615862 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d07669f7-1f8e-4e31-96a6-f3f3fcea5315-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.623613 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f4fee296-7753-4460-82f5-45df436f475d" (UID: "f4fee296-7753-4460-82f5-45df436f475d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.644366 4841 scope.go:117] "RemoveContainer" containerID="486365a09e373d1ef8a8361fd6355529f52a5a366911d17b2c9deed60994958b" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.660126 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6xl2c"] Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.667723 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4fee296-7753-4460-82f5-45df436f475d" (UID: "f4fee296-7753-4460-82f5-45df436f475d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.676516 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-6xl2c"] Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.681970 4841 scope.go:117] "RemoveContainer" containerID="8572b36aa5f483884060c16bc9e294bb2caa6c20893fea4c9e2e9252200236cb" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.700538 4841 scope.go:117] "RemoveContainer" containerID="eb4a8452ed0f0f33acb38d03f13f2051ccb657dabc84d31ca6a34e54df5ff2cc" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.720775 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.720808 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.725739 4841 scope.go:117] "RemoveContainer" containerID="a411dfe32aa66ef5ebc2bc6688c83ea76a3226d765da217a8342bc9ca163bc86" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.730203 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-config-data" (OuterVolumeSpecName: "config-data") pod "f4fee296-7753-4460-82f5-45df436f475d" (UID: "f4fee296-7753-4460-82f5-45df436f475d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.822232 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f4fee296-7753-4460-82f5-45df436f475d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:17 crc kubenswrapper[4841]: W1203 17:20:17.865260 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod839b1e72_e619_4c4c_80ac_0754251beb2a.slice/crio-b07186d6e028a9be76b8766b4028df6d64c8b2e5c40a3f5264f30393de7f472f WatchSource:0}: Error finding container b07186d6e028a9be76b8766b4028df6d64c8b2e5c40a3f5264f30393de7f472f: Status 404 returned error can't find the container with id b07186d6e028a9be76b8766b4028df6d64c8b2e5c40a3f5264f30393de7f472f Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.866639 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-799f468d8f-qbwl4"] Dec 03 17:20:17 crc kubenswrapper[4841]: I1203 17:20:17.946024 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-c5f5d9bb6-ddbgn"] Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.011231 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-555b568f78-v86bh"] Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.020550 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-6fd49975b6-6x566"] Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.028066 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.036952 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-7dd5698d74-2nhcr"] Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.046862 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:18 crc 
kubenswrapper[4841]: I1203 17:20:18.054162 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:18 crc kubenswrapper[4841]: E1203 17:20:18.054549 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="ceilometer-notification-agent" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.054568 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="ceilometer-notification-agent" Dec 03 17:20:18 crc kubenswrapper[4841]: E1203 17:20:18.054591 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07669f7-1f8e-4e31-96a6-f3f3fcea5315" containerName="dnsmasq-dns" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.054598 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07669f7-1f8e-4e31-96a6-f3f3fcea5315" containerName="dnsmasq-dns" Dec 03 17:20:18 crc kubenswrapper[4841]: E1203 17:20:18.054618 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="sg-core" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.054624 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="sg-core" Dec 03 17:20:18 crc kubenswrapper[4841]: E1203 17:20:18.054639 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="ceilometer-central-agent" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.054646 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="ceilometer-central-agent" Dec 03 17:20:18 crc kubenswrapper[4841]: E1203 17:20:18.054658 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="proxy-httpd" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.054664 
4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="proxy-httpd" Dec 03 17:20:18 crc kubenswrapper[4841]: E1203 17:20:18.054676 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d07669f7-1f8e-4e31-96a6-f3f3fcea5315" containerName="init" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.054685 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d07669f7-1f8e-4e31-96a6-f3f3fcea5315" containerName="init" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.054971 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="ceilometer-notification-agent" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.054989 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="ceilometer-central-agent" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.055002 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="sg-core" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.055012 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d07669f7-1f8e-4e31-96a6-f3f3fcea5315" containerName="dnsmasq-dns" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.055023 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4fee296-7753-4460-82f5-45df436f475d" containerName="proxy-httpd" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.058017 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.062296 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.062447 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.063349 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.231089 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-log-httpd\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.231159 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7n6f\" (UniqueName: \"kubernetes.io/projected/1cda072f-d517-4ac5-91f7-e65b07969c55-kube-api-access-s7n6f\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.231188 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.231235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-config-data\") pod \"ceilometer-0\" (UID: 
\"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.231260 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-scripts\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.231292 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-run-httpd\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.231313 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.253107 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d07669f7-1f8e-4e31-96a6-f3f3fcea5315" path="/var/lib/kubelet/pods/d07669f7-1f8e-4e31-96a6-f3f3fcea5315/volumes" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.253878 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4fee296-7753-4460-82f5-45df436f475d" path="/var/lib/kubelet/pods/f4fee296-7753-4460-82f5-45df436f475d/volumes" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.332809 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.332918 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-config-data\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.332953 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-scripts\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.332989 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-run-httpd\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.333014 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.333078 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-log-httpd\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.333124 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7n6f\" (UniqueName: 
\"kubernetes.io/projected/1cda072f-d517-4ac5-91f7-e65b07969c55-kube-api-access-s7n6f\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.333568 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-run-httpd\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.333631 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-log-httpd\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.338129 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.340372 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.340548 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-config-data\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.345526 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-scripts\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.351673 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7n6f\" (UniqueName: \"kubernetes.io/projected/1cda072f-d517-4ac5-91f7-e65b07969c55-kube-api-access-s7n6f\") pod \"ceilometer-0\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.386269 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.554965 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.559558 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6df7dcffd7-hvqxv" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.602144 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" event={"ID":"bfc25bf9-7fd7-4f92-9dbb-61f291592975","Type":"ContainerStarted","Data":"58ec5c3b6f5d192569abc40c9276fb4b3bd9a639456ba72e5e1f17d7554a260a"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.602189 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" event={"ID":"bfc25bf9-7fd7-4f92-9dbb-61f291592975","Type":"ContainerStarted","Data":"9b7ceba03ffcadb7aaf70386d3e1bbb44439fe0930246c7c234850ae7b48ec13"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.602456 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:18 crc 
kubenswrapper[4841]: I1203 17:20:18.610321 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dd5698d74-2nhcr" event={"ID":"0ada9b18-f961-4399-a186-a9ccb498d527","Type":"ContainerStarted","Data":"c7ab09e943b6d072659878d0bfaeee44c31b738d3fc855ff7338402a7e1f7cf8"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.610361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dd5698d74-2nhcr" event={"ID":"0ada9b18-f961-4399-a186-a9ccb498d527","Type":"ContainerStarted","Data":"73b8b146031cb5422856aab71b62ed4be9a1f94ec9c76a462b53803a1062b701"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.611359 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.619556 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-799f468d8f-qbwl4" event={"ID":"839b1e72-e619-4c4c-80ac-0754251beb2a","Type":"ContainerStarted","Data":"0c555fdf91fac4852aaaa1e2ca25de7f43df7d5c61605262b3e6534e45db661b"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.619599 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-799f468d8f-qbwl4" event={"ID":"839b1e72-e619-4c4c-80ac-0754251beb2a","Type":"ContainerStarted","Data":"b07186d6e028a9be76b8766b4028df6d64c8b2e5c40a3f5264f30393de7f472f"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.620478 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.627067 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" podStartSLOduration=4.627047263 podStartE2EDuration="4.627047263s" podCreationTimestamp="2025-12-03 17:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 17:20:18.626666114 +0000 UTC m=+1213.014186831" watchObservedRunningTime="2025-12-03 17:20:18.627047263 +0000 UTC m=+1213.014567990" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.653806 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fd49975b6-6x566" event={"ID":"79950490-c5c5-4db6-9121-220947907f6f","Type":"ContainerStarted","Data":"7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.653850 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fd49975b6-6x566" event={"ID":"79950490-c5c5-4db6-9121-220947907f6f","Type":"ContainerStarted","Data":"11269c1c7e1654aa5196450fa5a5a2052a8c2436ad0985d456f10752d9a6b094"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.654452 4841 scope.go:117] "RemoveContainer" containerID="7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.684278 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-799f468d8f-qbwl4" podStartSLOduration=4.684254949 podStartE2EDuration="4.684254949s" podCreationTimestamp="2025-12-03 17:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:18.653435589 +0000 UTC m=+1213.040956316" watchObservedRunningTime="2025-12-03 17:20:18.684254949 +0000 UTC m=+1213.071775666" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.691840 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-555b568f78-v86bh" event={"ID":"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d","Type":"ContainerStarted","Data":"1551f4a7d5b9b83643d931e6651828a02038b134f3c58eddcd6bc92d86a0cdde"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.691881 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/heat-engine-555b568f78-v86bh" event={"ID":"8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d","Type":"ContainerStarted","Data":"8a2233a71c45010c1b5020d09c1bfa41783d0405fa377158aee781c3746ebb31"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.692680 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.703018 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"967e0e4b-7b01-435c-92b7-dedc9b63dc5c","Type":"ContainerStarted","Data":"6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082"} Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.710359 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-7dd5698d74-2nhcr" podStartSLOduration=6.710338948 podStartE2EDuration="6.710338948s" podCreationTimestamp="2025-12-03 17:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:18.677561142 +0000 UTC m=+1213.065081869" watchObservedRunningTime="2025-12-03 17:20:18.710338948 +0000 UTC m=+1213.097859675" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.742826 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-555b568f78-v86bh" podStartSLOduration=6.742804446 podStartE2EDuration="6.742804446s" podCreationTimestamp="2025-12-03 17:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:18.719201595 +0000 UTC m=+1213.106722322" watchObservedRunningTime="2025-12-03 17:20:18.742804446 +0000 UTC m=+1213.130325173" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.763384 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" 
podStartSLOduration=2.9013275 podStartE2EDuration="15.763357105s" podCreationTimestamp="2025-12-03 17:20:03 +0000 UTC" firstStartedPulling="2025-12-03 17:20:04.352955662 +0000 UTC m=+1198.740476389" lastFinishedPulling="2025-12-03 17:20:17.214985267 +0000 UTC m=+1211.602505994" observedRunningTime="2025-12-03 17:20:18.737616645 +0000 UTC m=+1213.125137372" watchObservedRunningTime="2025-12-03 17:20:18.763357105 +0000 UTC m=+1213.150877832" Dec 03 17:20:18 crc kubenswrapper[4841]: I1203 17:20:18.829399 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.723717 4841 generic.go:334] "Generic (PLEG): container finished" podID="0ada9b18-f961-4399-a186-a9ccb498d527" containerID="c7ab09e943b6d072659878d0bfaeee44c31b738d3fc855ff7338402a7e1f7cf8" exitCode=1 Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.724042 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dd5698d74-2nhcr" event={"ID":"0ada9b18-f961-4399-a186-a9ccb498d527","Type":"ContainerDied","Data":"c7ab09e943b6d072659878d0bfaeee44c31b738d3fc855ff7338402a7e1f7cf8"} Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.724702 4841 scope.go:117] "RemoveContainer" containerID="c7ab09e943b6d072659878d0bfaeee44c31b738d3fc855ff7338402a7e1f7cf8" Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.731806 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerStarted","Data":"0c7a044aff2600adc7b75d726f9e831a690bf01aff1044c96cb02833e98d8067"} Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.741462 4841 generic.go:334] "Generic (PLEG): container finished" podID="79950490-c5c5-4db6-9121-220947907f6f" containerID="7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac" exitCode=1 Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.741496 4841 generic.go:334] "Generic (PLEG): 
container finished" podID="79950490-c5c5-4db6-9121-220947907f6f" containerID="933ac04ff8b576cb70daf62f9d587923d785eded85c83a4a3977e794156c04ad" exitCode=1 Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.743953 4841 scope.go:117] "RemoveContainer" containerID="933ac04ff8b576cb70daf62f9d587923d785eded85c83a4a3977e794156c04ad" Dec 03 17:20:19 crc kubenswrapper[4841]: E1203 17:20:19.744364 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6fd49975b6-6x566_openstack(79950490-c5c5-4db6-9121-220947907f6f)\"" pod="openstack/heat-cfnapi-6fd49975b6-6x566" podUID="79950490-c5c5-4db6-9121-220947907f6f" Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.744404 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fd49975b6-6x566" event={"ID":"79950490-c5c5-4db6-9121-220947907f6f","Type":"ContainerDied","Data":"7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac"} Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.744431 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fd49975b6-6x566" event={"ID":"79950490-c5c5-4db6-9121-220947907f6f","Type":"ContainerDied","Data":"933ac04ff8b576cb70daf62f9d587923d785eded85c83a4a3977e794156c04ad"} Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.744775 4841 scope.go:117] "RemoveContainer" containerID="7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac" Dec 03 17:20:19 crc kubenswrapper[4841]: I1203 17:20:19.747489 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-58569788fd-b9rl2" podUID="cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.168:8000/healthcheck\": read tcp 10.217.0.2:39208->10.217.0.168:8000: read: connection reset by peer" Dec 03 17:20:19 crc kubenswrapper[4841]: 
I1203 17:20:19.776369 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-f96469445-n4lln" podUID="0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.169:8004/healthcheck\": read tcp 10.217.0.2:33376->10.217.0.169:8004: read: connection reset by peer" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.032487 4841 scope.go:117] "RemoveContainer" containerID="7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac" Dec 03 17:20:20 crc kubenswrapper[4841]: E1203 17:20:20.033115 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac\": container with ID starting with 7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac not found: ID does not exist" containerID="7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.033164 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac"} err="failed to get container status \"7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac\": rpc error: code = NotFound desc = could not find container \"7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac\": container with ID starting with 7576049ceb9b2bcb1b1636b71a74df5ba43f06b74f9eb88e7318a534c5abecac not found: ID does not exist" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.179245 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.261599 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.300877 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkd9c\" (UniqueName: \"kubernetes.io/projected/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-kube-api-access-jkd9c\") pod \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.300968 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data\") pod \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.300987 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data-custom\") pod \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.301020 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data-custom\") pod \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.301067 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-combined-ca-bundle\") pod \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\" (UID: \"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12\") " Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.301091 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data\") pod \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.301121 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5dnb\" (UniqueName: \"kubernetes.io/projected/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-kube-api-access-d5dnb\") pod \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.301163 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-combined-ca-bundle\") pod \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\" (UID: \"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a\") " Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.304448 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-kube-api-access-jkd9c" (OuterVolumeSpecName: "kube-api-access-jkd9c") pod "cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" (UID: "cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12"). InnerVolumeSpecName "kube-api-access-jkd9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.310113 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-kube-api-access-d5dnb" (OuterVolumeSpecName: "kube-api-access-d5dnb") pod "0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" (UID: "0cdefc3b-a82b-4cff-9aca-d0dccec39e3a"). InnerVolumeSpecName "kube-api-access-d5dnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.310216 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" (UID: "cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.313067 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" (UID: "0cdefc3b-a82b-4cff-9aca-d0dccec39e3a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.340167 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" (UID: "0cdefc3b-a82b-4cff-9aca-d0dccec39e3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.383793 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" (UID: "cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.387523 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data" (OuterVolumeSpecName: "config-data") pod "cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" (UID: "cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.406396 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5dnb\" (UniqueName: \"kubernetes.io/projected/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-kube-api-access-d5dnb\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.406427 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.406439 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkd9c\" (UniqueName: \"kubernetes.io/projected/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-kube-api-access-jkd9c\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.406448 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.406457 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.406466 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.406475 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.420153 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data" (OuterVolumeSpecName: "config-data") pod "0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" (UID: "0cdefc3b-a82b-4cff-9aca-d0dccec39e3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.508218 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.753799 4841 generic.go:334] "Generic (PLEG): container finished" podID="0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" containerID="64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a" exitCode=0 Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.753876 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-f96469445-n4lln" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.753917 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f96469445-n4lln" event={"ID":"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a","Type":"ContainerDied","Data":"64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a"} Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.754230 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f96469445-n4lln" event={"ID":"0cdefc3b-a82b-4cff-9aca-d0dccec39e3a","Type":"ContainerDied","Data":"8f17a10e681e59970b0801f9a1bd765af548d02d0c0d0261f810da992b0634ce"} Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.754254 4841 scope.go:117] "RemoveContainer" containerID="64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.760696 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerStarted","Data":"e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce"} Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.760736 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerStarted","Data":"29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71"} Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.763739 4841 generic.go:334] "Generic (PLEG): container finished" podID="cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" containerID="11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72" exitCode=0 Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.763817 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58569788fd-b9rl2" 
event={"ID":"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12","Type":"ContainerDied","Data":"11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72"} Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.763850 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-58569788fd-b9rl2" event={"ID":"cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12","Type":"ContainerDied","Data":"2618837582b42dad8b4e8e3c6c4aad007beb679230e036900fb7e6dbc452dbcc"} Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.763922 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-58569788fd-b9rl2" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.766532 4841 generic.go:334] "Generic (PLEG): container finished" podID="0ada9b18-f961-4399-a186-a9ccb498d527" containerID="5a2ce045cfc856c7a2ca86ac855f91646c1ef62e7501cf2c8082980302213cd0" exitCode=1 Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.766591 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dd5698d74-2nhcr" event={"ID":"0ada9b18-f961-4399-a186-a9ccb498d527","Type":"ContainerDied","Data":"5a2ce045cfc856c7a2ca86ac855f91646c1ef62e7501cf2c8082980302213cd0"} Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.767162 4841 scope.go:117] "RemoveContainer" containerID="5a2ce045cfc856c7a2ca86ac855f91646c1ef62e7501cf2c8082980302213cd0" Dec 03 17:20:20 crc kubenswrapper[4841]: E1203 17:20:20.767442 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7dd5698d74-2nhcr_openstack(0ada9b18-f961-4399-a186-a9ccb498d527)\"" pod="openstack/heat-api-7dd5698d74-2nhcr" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.775634 4841 scope.go:117] "RemoveContainer" containerID="933ac04ff8b576cb70daf62f9d587923d785eded85c83a4a3977e794156c04ad" Dec 03 
17:20:20 crc kubenswrapper[4841]: E1203 17:20:20.775963 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6fd49975b6-6x566_openstack(79950490-c5c5-4db6-9121-220947907f6f)\"" pod="openstack/heat-cfnapi-6fd49975b6-6x566" podUID="79950490-c5c5-4db6-9121-220947907f6f" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.805041 4841 scope.go:117] "RemoveContainer" containerID="64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a" Dec 03 17:20:20 crc kubenswrapper[4841]: E1203 17:20:20.806063 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a\": container with ID starting with 64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a not found: ID does not exist" containerID="64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.806197 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a"} err="failed to get container status \"64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a\": rpc error: code = NotFound desc = could not find container \"64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a\": container with ID starting with 64eb5657d212a601d2de032806fe57959978072d0431123db71fc9fb2a901d1a not found: ID does not exist" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.806285 4841 scope.go:117] "RemoveContainer" containerID="11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.833718 4841 scope.go:117] "RemoveContainer" 
containerID="11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72" Dec 03 17:20:20 crc kubenswrapper[4841]: E1203 17:20:20.834254 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72\": container with ID starting with 11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72 not found: ID does not exist" containerID="11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.834297 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72"} err="failed to get container status \"11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72\": rpc error: code = NotFound desc = could not find container \"11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72\": container with ID starting with 11532e14b78eb043195d428b752aa242a911b7932be64f8fa897c3f347192d72 not found: ID does not exist" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.834323 4841 scope.go:117] "RemoveContainer" containerID="c7ab09e943b6d072659878d0bfaeee44c31b738d3fc855ff7338402a7e1f7cf8" Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.858361 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-58569788fd-b9rl2"] Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.878721 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-58569788fd-b9rl2"] Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.896485 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-f96469445-n4lln"] Dec 03 17:20:20 crc kubenswrapper[4841]: I1203 17:20:20.909787 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-f96469445-n4lln"] Dec 03 17:20:21 crc 
kubenswrapper[4841]: I1203 17:20:21.376843 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-cxq5q"] Dec 03 17:20:21 crc kubenswrapper[4841]: E1203 17:20:21.378576 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" containerName="heat-api" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.378597 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" containerName="heat-api" Dec 03 17:20:21 crc kubenswrapper[4841]: E1203 17:20:21.378604 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" containerName="heat-cfnapi" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.378610 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" containerName="heat-cfnapi" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.378782 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" containerName="heat-api" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.378806 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" containerName="heat-cfnapi" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.379462 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.388201 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cxq5q"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.438936 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5e1317-cb39-41ff-bd74-0ff6f5888838-operator-scripts\") pod \"nova-api-db-create-cxq5q\" (UID: \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\") " pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.439141 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxjbt\" (UniqueName: \"kubernetes.io/projected/4b5e1317-cb39-41ff-bd74-0ff6f5888838-kube-api-access-lxjbt\") pod \"nova-api-db-create-cxq5q\" (UID: \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\") " pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.466219 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-cf3a-account-create-update-9qvw6"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.469345 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.474540 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.501067 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cf3a-account-create-update-9qvw6"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.541259 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf242\" (UniqueName: \"kubernetes.io/projected/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-kube-api-access-sf242\") pod \"nova-api-cf3a-account-create-update-9qvw6\" (UID: \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\") " pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.541326 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-operator-scripts\") pod \"nova-api-cf3a-account-create-update-9qvw6\" (UID: \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\") " pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.541384 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5e1317-cb39-41ff-bd74-0ff6f5888838-operator-scripts\") pod \"nova-api-db-create-cxq5q\" (UID: \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\") " pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.541452 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxjbt\" (UniqueName: \"kubernetes.io/projected/4b5e1317-cb39-41ff-bd74-0ff6f5888838-kube-api-access-lxjbt\") pod \"nova-api-db-create-cxq5q\" 
(UID: \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\") " pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.542583 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5e1317-cb39-41ff-bd74-0ff6f5888838-operator-scripts\") pod \"nova-api-db-create-cxq5q\" (UID: \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\") " pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.565326 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxjbt\" (UniqueName: \"kubernetes.io/projected/4b5e1317-cb39-41ff-bd74-0ff6f5888838-kube-api-access-lxjbt\") pod \"nova-api-db-create-cxq5q\" (UID: \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\") " pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.575176 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-f45bf"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.576376 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.582715 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f45bf"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.643244 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-operator-scripts\") pod \"nova-cell0-db-create-f45bf\" (UID: \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\") " pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.643312 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wr7b\" (UniqueName: \"kubernetes.io/projected/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-kube-api-access-6wr7b\") pod \"nova-cell0-db-create-f45bf\" (UID: \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\") " pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.643373 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf242\" (UniqueName: \"kubernetes.io/projected/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-kube-api-access-sf242\") pod \"nova-api-cf3a-account-create-update-9qvw6\" (UID: \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\") " pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.643399 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-operator-scripts\") pod \"nova-api-cf3a-account-create-update-9qvw6\" (UID: \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\") " pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.644074 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-operator-scripts\") pod \"nova-api-cf3a-account-create-update-9qvw6\" (UID: \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\") " pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.658532 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-njcv9"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.662578 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.666547 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf242\" (UniqueName: \"kubernetes.io/projected/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-kube-api-access-sf242\") pod \"nova-api-cf3a-account-create-update-9qvw6\" (UID: \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\") " pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.681638 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-a208-account-create-update-fzsjz"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.683026 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.695411 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.695770 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-njcv9"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.696770 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.702318 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a208-account-create-update-fzsjz"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.745889 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w58t\" (UniqueName: \"kubernetes.io/projected/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-kube-api-access-2w58t\") pod \"nova-cell0-a208-account-create-update-fzsjz\" (UID: \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\") " pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.747995 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-operator-scripts\") pod \"nova-cell0-db-create-f45bf\" (UID: \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\") " pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.748061 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wr7b\" (UniqueName: \"kubernetes.io/projected/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-kube-api-access-6wr7b\") pod \"nova-cell0-db-create-f45bf\" (UID: \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\") " pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.748118 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/defee520-123a-4d81-89c9-3ac1100f37ea-operator-scripts\") pod \"nova-cell1-db-create-njcv9\" (UID: \"defee520-123a-4d81-89c9-3ac1100f37ea\") " pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.748148 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-operator-scripts\") pod \"nova-cell0-a208-account-create-update-fzsjz\" (UID: \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\") " pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.748164 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bkt8\" (UniqueName: \"kubernetes.io/projected/defee520-123a-4d81-89c9-3ac1100f37ea-kube-api-access-4bkt8\") pod \"nova-cell1-db-create-njcv9\" (UID: \"defee520-123a-4d81-89c9-3ac1100f37ea\") " pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.749619 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-operator-scripts\") pod \"nova-cell0-db-create-f45bf\" (UID: \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\") " pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.767508 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wr7b\" (UniqueName: \"kubernetes.io/projected/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-kube-api-access-6wr7b\") pod \"nova-cell0-db-create-f45bf\" (UID: \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\") " pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.796353 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.806117 4841 scope.go:117] "RemoveContainer" containerID="5a2ce045cfc856c7a2ca86ac855f91646c1ef62e7501cf2c8082980302213cd0" Dec 03 17:20:21 crc kubenswrapper[4841]: E1203 17:20:21.806486 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7dd5698d74-2nhcr_openstack(0ada9b18-f961-4399-a186-a9ccb498d527)\"" pod="openstack/heat-api-7dd5698d74-2nhcr" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.820716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerStarted","Data":"a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51"} Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.850132 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/defee520-123a-4d81-89c9-3ac1100f37ea-operator-scripts\") pod \"nova-cell1-db-create-njcv9\" (UID: \"defee520-123a-4d81-89c9-3ac1100f37ea\") " pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.850185 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-operator-scripts\") pod \"nova-cell0-a208-account-create-update-fzsjz\" (UID: \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\") " pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.850209 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bkt8\" (UniqueName: 
\"kubernetes.io/projected/defee520-123a-4d81-89c9-3ac1100f37ea-kube-api-access-4bkt8\") pod \"nova-cell1-db-create-njcv9\" (UID: \"defee520-123a-4d81-89c9-3ac1100f37ea\") " pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.850336 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2w58t\" (UniqueName: \"kubernetes.io/projected/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-kube-api-access-2w58t\") pod \"nova-cell0-a208-account-create-update-fzsjz\" (UID: \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\") " pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.851079 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-a027-account-create-update-6gnsg"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.852122 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/defee520-123a-4d81-89c9-3ac1100f37ea-operator-scripts\") pod \"nova-cell1-db-create-njcv9\" (UID: \"defee520-123a-4d81-89c9-3ac1100f37ea\") " pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.852687 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.852701 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-operator-scripts\") pod \"nova-cell0-a208-account-create-update-fzsjz\" (UID: \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\") " pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.860392 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.864349 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a027-account-create-update-6gnsg"] Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.879894 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bkt8\" (UniqueName: \"kubernetes.io/projected/defee520-123a-4d81-89c9-3ac1100f37ea-kube-api-access-4bkt8\") pod \"nova-cell1-db-create-njcv9\" (UID: \"defee520-123a-4d81-89c9-3ac1100f37ea\") " pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.885751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w58t\" (UniqueName: \"kubernetes.io/projected/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-kube-api-access-2w58t\") pod \"nova-cell0-a208-account-create-update-fzsjz\" (UID: \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\") " pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.899004 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.955740 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gck64\" (UniqueName: \"kubernetes.io/projected/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-kube-api-access-gck64\") pod \"nova-cell1-a027-account-create-update-6gnsg\" (UID: \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\") " pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.955812 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-operator-scripts\") pod \"nova-cell1-a027-account-create-update-6gnsg\" (UID: \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\") " pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:21 crc kubenswrapper[4841]: I1203 17:20:21.996549 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.017721 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.057864 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gck64\" (UniqueName: \"kubernetes.io/projected/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-kube-api-access-gck64\") pod \"nova-cell1-a027-account-create-update-6gnsg\" (UID: \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\") " pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.057940 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-operator-scripts\") pod \"nova-cell1-a027-account-create-update-6gnsg\" (UID: \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\") " pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.058719 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-operator-scripts\") pod \"nova-cell1-a027-account-create-update-6gnsg\" (UID: \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\") " pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.074829 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gck64\" (UniqueName: \"kubernetes.io/projected/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-kube-api-access-gck64\") pod \"nova-cell1-a027-account-create-update-6gnsg\" (UID: \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\") " pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.281420 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.283121 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cdefc3b-a82b-4cff-9aca-d0dccec39e3a" path="/var/lib/kubelet/pods/0cdefc3b-a82b-4cff-9aca-d0dccec39e3a/volumes" Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.283644 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12" path="/var/lib/kubelet/pods/cdb5b46f-ef21-412d-a1de-f8bf0dd8ef12/volumes" Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.352641 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-cxq5q"] Dec 03 17:20:22 crc kubenswrapper[4841]: W1203 17:20:22.393094 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b5e1317_cb39_41ff_bd74_0ff6f5888838.slice/crio-7a55744ba54f5ba2ceb553d65ced256d38f33c32211b42e73fef02e067f65f3f WatchSource:0}: Error finding container 7a55744ba54f5ba2ceb553d65ced256d38f33c32211b42e73fef02e067f65f3f: Status 404 returned error can't find the container with id 7a55744ba54f5ba2ceb553d65ced256d38f33c32211b42e73fef02e067f65f3f Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.465187 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-cf3a-account-create-update-9qvw6"] Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.533542 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.618984 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-f45bf"] Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.715660 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-a208-account-create-update-fzsjz"] Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 
17:20:22.800917 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-njcv9"] Dec 03 17:20:22 crc kubenswrapper[4841]: W1203 17:20:22.803179 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddefee520_123a_4d81_89c9_3ac1100f37ea.slice/crio-6eb489d25690fbdd4cb94083b621b4198f2e684c2dc595cf4b5553add56a2279 WatchSource:0}: Error finding container 6eb489d25690fbdd4cb94083b621b4198f2e684c2dc595cf4b5553add56a2279: Status 404 returned error can't find the container with id 6eb489d25690fbdd4cb94083b621b4198f2e684c2dc595cf4b5553add56a2279 Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.849838 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cxq5q" event={"ID":"4b5e1317-cb39-41ff-bd74-0ff6f5888838","Type":"ContainerStarted","Data":"7a55744ba54f5ba2ceb553d65ced256d38f33c32211b42e73fef02e067f65f3f"} Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.853409 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-njcv9" event={"ID":"defee520-123a-4d81-89c9-3ac1100f37ea","Type":"ContainerStarted","Data":"6eb489d25690fbdd4cb94083b621b4198f2e684c2dc595cf4b5553add56a2279"} Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.857075 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f45bf" event={"ID":"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10","Type":"ContainerStarted","Data":"e7084292d1d12ad4e5ab725fd7fd4e937f583defaf2f8fa75c2d1dce84a33e05"} Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.861930 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a208-account-create-update-fzsjz" event={"ID":"bad0d4a1-7f8d-4417-bbb2-df84240ddb39","Type":"ContainerStarted","Data":"0f0ee92e20061f43762e9337cec75ec21789176ef252cc25f0fa1bce0fe21ce3"} Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.867194 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cf3a-account-create-update-9qvw6" event={"ID":"8770bd7d-05a2-4cc4-9484-aad5e79ad58d","Type":"ContainerStarted","Data":"acd05fa16a1cff5f312a39f47b332e84b7d20898c3aee2abfc8fe784bf26f328"} Dec 03 17:20:22 crc kubenswrapper[4841]: I1203 17:20:22.889666 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-a027-account-create-update-6gnsg"] Dec 03 17:20:23 crc kubenswrapper[4841]: I1203 17:20:23.357022 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:23 crc kubenswrapper[4841]: I1203 17:20:23.357062 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:23 crc kubenswrapper[4841]: I1203 17:20:23.357686 4841 scope.go:117] "RemoveContainer" containerID="5a2ce045cfc856c7a2ca86ac855f91646c1ef62e7501cf2c8082980302213cd0" Dec 03 17:20:23 crc kubenswrapper[4841]: E1203 17:20:23.357891 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-7dd5698d74-2nhcr_openstack(0ada9b18-f961-4399-a186-a9ccb498d527)\"" pod="openstack/heat-api-7dd5698d74-2nhcr" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" Dec 03 17:20:23 crc kubenswrapper[4841]: I1203 17:20:23.376444 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:23 crc kubenswrapper[4841]: I1203 17:20:23.376505 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:23 crc kubenswrapper[4841]: I1203 17:20:23.377509 4841 scope.go:117] "RemoveContainer" containerID="933ac04ff8b576cb70daf62f9d587923d785eded85c83a4a3977e794156c04ad" Dec 03 17:20:23 crc kubenswrapper[4841]: E1203 17:20:23.377867 4841 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-6fd49975b6-6x566_openstack(79950490-c5c5-4db6-9121-220947907f6f)\"" pod="openstack/heat-cfnapi-6fd49975b6-6x566" podUID="79950490-c5c5-4db6-9121-220947907f6f" Dec 03 17:20:23 crc kubenswrapper[4841]: I1203 17:20:23.875829 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a027-account-create-update-6gnsg" event={"ID":"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8","Type":"ContainerStarted","Data":"c9b0c61c1b2056f1b45f267b92e4460e2cc69fb6ce9880225f650dfe01a86ebd"} Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.887420 4841 generic.go:334] "Generic (PLEG): container finished" podID="8fa56c6b-2244-457e-a7ea-cf0f61bc8b10" containerID="435821fbe8b6d9cbd84e2fc94e04ec1826c70d793cac143da60cf134fb0cc891" exitCode=0 Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.887616 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f45bf" event={"ID":"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10","Type":"ContainerDied","Data":"435821fbe8b6d9cbd84e2fc94e04ec1826c70d793cac143da60cf134fb0cc891"} Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.890803 4841 generic.go:334] "Generic (PLEG): container finished" podID="bad0d4a1-7f8d-4417-bbb2-df84240ddb39" containerID="76838e7afce317c8d2fb6cee10a9505ee704fa15b02fdfb938f26df657901729" exitCode=0 Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.890866 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a208-account-create-update-fzsjz" event={"ID":"bad0d4a1-7f8d-4417-bbb2-df84240ddb39","Type":"ContainerDied","Data":"76838e7afce317c8d2fb6cee10a9505ee704fa15b02fdfb938f26df657901729"} Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.893094 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="8770bd7d-05a2-4cc4-9484-aad5e79ad58d" containerID="4637e098c6ee6399f08f3dfea946d0a93fa18ee9c880a15a020c8e732b91f40e" exitCode=0 Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.893146 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cf3a-account-create-update-9qvw6" event={"ID":"8770bd7d-05a2-4cc4-9484-aad5e79ad58d","Type":"ContainerDied","Data":"4637e098c6ee6399f08f3dfea946d0a93fa18ee9c880a15a020c8e732b91f40e"} Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.900428 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerStarted","Data":"6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d"} Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.900998 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.900995 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="ceilometer-central-agent" containerID="cri-o://29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71" gracePeriod=30 Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.901022 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="proxy-httpd" containerID="cri-o://6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d" gracePeriod=30 Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.901024 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="sg-core" containerID="cri-o://a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51" gracePeriod=30 Dec 03 17:20:24 crc kubenswrapper[4841]: 
I1203 17:20:24.901108 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="ceilometer-notification-agent" containerID="cri-o://e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce" gracePeriod=30 Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.911178 4841 generic.go:334] "Generic (PLEG): container finished" podID="4b5e1317-cb39-41ff-bd74-0ff6f5888838" containerID="e9a29522c92532e17e041a2e394eace09dfce8e8d095eac48349107022dd2729" exitCode=0 Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.911511 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cxq5q" event={"ID":"4b5e1317-cb39-41ff-bd74-0ff6f5888838","Type":"ContainerDied","Data":"e9a29522c92532e17e041a2e394eace09dfce8e8d095eac48349107022dd2729"} Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.913637 4841 generic.go:334] "Generic (PLEG): container finished" podID="3b588dfb-c0c3-44c9-bb78-88f38c32d7d8" containerID="cbe654b0b644c77d1c3a980307cd97270ee6d59e76f24452b50391f0349cb1c3" exitCode=0 Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.913703 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a027-account-create-update-6gnsg" event={"ID":"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8","Type":"ContainerDied","Data":"cbe654b0b644c77d1c3a980307cd97270ee6d59e76f24452b50391f0349cb1c3"} Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.915888 4841 generic.go:334] "Generic (PLEG): container finished" podID="defee520-123a-4d81-89c9-3ac1100f37ea" containerID="e81b656705ef60f35ed3d450b402e5808b1cda91c2c3177cd557019368cd9b69" exitCode=0 Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.915936 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-njcv9" 
event={"ID":"defee520-123a-4d81-89c9-3ac1100f37ea","Type":"ContainerDied","Data":"e81b656705ef60f35ed3d450b402e5808b1cda91c2c3177cd557019368cd9b69"} Dec 03 17:20:24 crc kubenswrapper[4841]: I1203 17:20:24.929702 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.744377418 podStartE2EDuration="6.929680783s" podCreationTimestamp="2025-12-03 17:20:18 +0000 UTC" firstStartedPulling="2025-12-03 17:20:18.881428292 +0000 UTC m=+1213.268949019" lastFinishedPulling="2025-12-03 17:20:24.066731647 +0000 UTC m=+1218.454252384" observedRunningTime="2025-12-03 17:20:24.923877418 +0000 UTC m=+1219.311398145" watchObservedRunningTime="2025-12-03 17:20:24.929680783 +0000 UTC m=+1219.317201510" Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.759184 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.759467 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerName="glance-log" containerID="cri-o://aaf61c90e47e2bab624d684c2d66061e20f328133d254a00df33a94a9fcbbb8c" gracePeriod=30 Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.759637 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerName="glance-httpd" containerID="cri-o://e2558236d0746a73d50a3a05b611cd0dba02053bc82e2eeab22e659894be5e5b" gracePeriod=30 Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.942123 4841 generic.go:334] "Generic (PLEG): container finished" podID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerID="6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d" exitCode=0 Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.942747 4841 generic.go:334] "Generic 
(PLEG): container finished" podID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerID="a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51" exitCode=2 Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.942775 4841 generic.go:334] "Generic (PLEG): container finished" podID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerID="e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce" exitCode=0 Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.942137 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerDied","Data":"6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d"} Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.942925 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerDied","Data":"a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51"} Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.942943 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerDied","Data":"e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce"} Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.945175 4841 generic.go:334] "Generic (PLEG): container finished" podID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerID="aaf61c90e47e2bab624d684c2d66061e20f328133d254a00df33a94a9fcbbb8c" exitCode=143 Dec 03 17:20:25 crc kubenswrapper[4841]: I1203 17:20:25.945336 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"51ec7e80-3ef6-478e-b388-5835beb1c733","Type":"ContainerDied","Data":"aaf61c90e47e2bab624d684c2d66061e20f328133d254a00df33a94a9fcbbb8c"} Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.299383 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/heat-api-799f468d8f-qbwl4" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.364737 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7dd5698d74-2nhcr"] Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.410231 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.460377 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w58t\" (UniqueName: \"kubernetes.io/projected/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-kube-api-access-2w58t\") pod \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\" (UID: \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.460510 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-operator-scripts\") pod \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\" (UID: \"bad0d4a1-7f8d-4417-bbb2-df84240ddb39\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.461398 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bad0d4a1-7f8d-4417-bbb2-df84240ddb39" (UID: "bad0d4a1-7f8d-4417-bbb2-df84240ddb39"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.468156 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-kube-api-access-2w58t" (OuterVolumeSpecName: "kube-api-access-2w58t") pod "bad0d4a1-7f8d-4417-bbb2-df84240ddb39" (UID: "bad0d4a1-7f8d-4417-bbb2-df84240ddb39"). 
InnerVolumeSpecName "kube-api-access-2w58t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.563155 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w58t\" (UniqueName: \"kubernetes.io/projected/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-kube-api-access-2w58t\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.563387 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bad0d4a1-7f8d-4417-bbb2-df84240ddb39-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.579397 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.590578 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-c5f5d9bb6-ddbgn" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.604154 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.606531 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.617175 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.664263 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gck64\" (UniqueName: \"kubernetes.io/projected/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-kube-api-access-gck64\") pod \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\" (UID: \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.664356 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6wr7b\" (UniqueName: \"kubernetes.io/projected/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-kube-api-access-6wr7b\") pod \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\" (UID: \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.664384 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sf242\" (UniqueName: \"kubernetes.io/projected/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-kube-api-access-sf242\") pod \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\" (UID: \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.664428 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-operator-scripts\") pod \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\" (UID: \"8770bd7d-05a2-4cc4-9484-aad5e79ad58d\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.664445 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-operator-scripts\") pod \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\" (UID: \"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.664485 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lxjbt\" (UniqueName: \"kubernetes.io/projected/4b5e1317-cb39-41ff-bd74-0ff6f5888838-kube-api-access-lxjbt\") pod \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\" (UID: \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.664518 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5e1317-cb39-41ff-bd74-0ff6f5888838-operator-scripts\") pod \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\" (UID: \"4b5e1317-cb39-41ff-bd74-0ff6f5888838\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.664562 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-operator-scripts\") pod \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\" (UID: \"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.674825 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3b588dfb-c0c3-44c9-bb78-88f38c32d7d8" (UID: "3b588dfb-c0c3-44c9-bb78-88f38c32d7d8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.676113 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8fa56c6b-2244-457e-a7ea-cf0f61bc8b10" (UID: "8fa56c6b-2244-457e-a7ea-cf0f61bc8b10"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.683062 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-kube-api-access-6wr7b" (OuterVolumeSpecName: "kube-api-access-6wr7b") pod "8fa56c6b-2244-457e-a7ea-cf0f61bc8b10" (UID: "8fa56c6b-2244-457e-a7ea-cf0f61bc8b10"). InnerVolumeSpecName "kube-api-access-6wr7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.687456 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-kube-api-access-gck64" (OuterVolumeSpecName: "kube-api-access-gck64") pod "3b588dfb-c0c3-44c9-bb78-88f38c32d7d8" (UID: "3b588dfb-c0c3-44c9-bb78-88f38c32d7d8"). InnerVolumeSpecName "kube-api-access-gck64". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.687606 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8770bd7d-05a2-4cc4-9484-aad5e79ad58d" (UID: "8770bd7d-05a2-4cc4-9484-aad5e79ad58d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.699190 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b5e1317-cb39-41ff-bd74-0ff6f5888838-kube-api-access-lxjbt" (OuterVolumeSpecName: "kube-api-access-lxjbt") pod "4b5e1317-cb39-41ff-bd74-0ff6f5888838" (UID: "4b5e1317-cb39-41ff-bd74-0ff6f5888838"). InnerVolumeSpecName "kube-api-access-lxjbt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.705870 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-kube-api-access-sf242" (OuterVolumeSpecName: "kube-api-access-sf242") pod "8770bd7d-05a2-4cc4-9484-aad5e79ad58d" (UID: "8770bd7d-05a2-4cc4-9484-aad5e79ad58d"). InnerVolumeSpecName "kube-api-access-sf242". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.707296 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b5e1317-cb39-41ff-bd74-0ff6f5888838-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4b5e1317-cb39-41ff-bd74-0ff6f5888838" (UID: "4b5e1317-cb39-41ff-bd74-0ff6f5888838"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.729506 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6fd49975b6-6x566"] Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.767438 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gck64\" (UniqueName: \"kubernetes.io/projected/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-kube-api-access-gck64\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.767486 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6wr7b\" (UniqueName: \"kubernetes.io/projected/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-kube-api-access-6wr7b\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.767500 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sf242\" (UniqueName: \"kubernetes.io/projected/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-kube-api-access-sf242\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc 
kubenswrapper[4841]: I1203 17:20:26.767513 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8770bd7d-05a2-4cc4-9484-aad5e79ad58d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.767525 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.767538 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxjbt\" (UniqueName: \"kubernetes.io/projected/4b5e1317-cb39-41ff-bd74-0ff6f5888838-kube-api-access-lxjbt\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.767549 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4b5e1317-cb39-41ff-bd74-0ff6f5888838-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.767576 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.815148 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.868805 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/defee520-123a-4d81-89c9-3ac1100f37ea-operator-scripts\") pod \"defee520-123a-4d81-89c9-3ac1100f37ea\" (UID: \"defee520-123a-4d81-89c9-3ac1100f37ea\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.868976 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bkt8\" (UniqueName: \"kubernetes.io/projected/defee520-123a-4d81-89c9-3ac1100f37ea-kube-api-access-4bkt8\") pod \"defee520-123a-4d81-89c9-3ac1100f37ea\" (UID: \"defee520-123a-4d81-89c9-3ac1100f37ea\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.869419 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/defee520-123a-4d81-89c9-3ac1100f37ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "defee520-123a-4d81-89c9-3ac1100f37ea" (UID: "defee520-123a-4d81-89c9-3ac1100f37ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.869557 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/defee520-123a-4d81-89c9-3ac1100f37ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.874451 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/defee520-123a-4d81-89c9-3ac1100f37ea-kube-api-access-4bkt8" (OuterVolumeSpecName: "kube-api-access-4bkt8") pod "defee520-123a-4d81-89c9-3ac1100f37ea" (UID: "defee520-123a-4d81-89c9-3ac1100f37ea"). InnerVolumeSpecName "kube-api-access-4bkt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.909697 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.955648 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-a208-account-create-update-fzsjz" event={"ID":"bad0d4a1-7f8d-4417-bbb2-df84240ddb39","Type":"ContainerDied","Data":"0f0ee92e20061f43762e9337cec75ec21789176ef252cc25f0fa1bce0fe21ce3"} Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.955672 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-a208-account-create-update-fzsjz" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.955685 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f0ee92e20061f43762e9337cec75ec21789176ef252cc25f0fa1bce0fe21ce3" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.956995 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-cf3a-account-create-update-9qvw6" event={"ID":"8770bd7d-05a2-4cc4-9484-aad5e79ad58d","Type":"ContainerDied","Data":"acd05fa16a1cff5f312a39f47b332e84b7d20898c3aee2abfc8fe784bf26f328"} Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.957017 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acd05fa16a1cff5f312a39f47b332e84b7d20898c3aee2abfc8fe784bf26f328" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.957031 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-cf3a-account-create-update-9qvw6" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.960430 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-cxq5q" event={"ID":"4b5e1317-cb39-41ff-bd74-0ff6f5888838","Type":"ContainerDied","Data":"7a55744ba54f5ba2ceb553d65ced256d38f33c32211b42e73fef02e067f65f3f"} Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.960452 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a55744ba54f5ba2ceb553d65ced256d38f33c32211b42e73fef02e067f65f3f" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.960468 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-cxq5q" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.967187 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-a027-account-create-update-6gnsg" event={"ID":"3b588dfb-c0c3-44c9-bb78-88f38c32d7d8","Type":"ContainerDied","Data":"c9b0c61c1b2056f1b45f267b92e4460e2cc69fb6ce9880225f650dfe01a86ebd"} Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.967213 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9b0c61c1b2056f1b45f267b92e4460e2cc69fb6ce9880225f650dfe01a86ebd" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.967259 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-a027-account-create-update-6gnsg" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.970726 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-combined-ca-bundle\") pod \"0ada9b18-f961-4399-a186-a9ccb498d527\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.970794 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data-custom\") pod \"0ada9b18-f961-4399-a186-a9ccb498d527\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.970817 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcj5r\" (UniqueName: \"kubernetes.io/projected/0ada9b18-f961-4399-a186-a9ccb498d527-kube-api-access-bcj5r\") pod \"0ada9b18-f961-4399-a186-a9ccb498d527\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.970939 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data\") pod \"0ada9b18-f961-4399-a186-a9ccb498d527\" (UID: \"0ada9b18-f961-4399-a186-a9ccb498d527\") " Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.971143 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-7dd5698d74-2nhcr" event={"ID":"0ada9b18-f961-4399-a186-a9ccb498d527","Type":"ContainerDied","Data":"73b8b146031cb5422856aab71b62ed4be9a1f94ec9c76a462b53803a1062b701"} Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.971177 4841 scope.go:117] "RemoveContainer" 
containerID="5a2ce045cfc856c7a2ca86ac855f91646c1ef62e7501cf2c8082980302213cd0" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.971232 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-7dd5698d74-2nhcr" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.971600 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bkt8\" (UniqueName: \"kubernetes.io/projected/defee520-123a-4d81-89c9-3ac1100f37ea-kube-api-access-4bkt8\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.972939 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-njcv9" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.972899 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-njcv9" event={"ID":"defee520-123a-4d81-89c9-3ac1100f37ea","Type":"ContainerDied","Data":"6eb489d25690fbdd4cb94083b621b4198f2e684c2dc595cf4b5553add56a2279"} Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.972978 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6eb489d25690fbdd4cb94083b621b4198f2e684c2dc595cf4b5553add56a2279" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.975723 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0ada9b18-f961-4399-a186-a9ccb498d527" (UID: "0ada9b18-f961-4399-a186-a9ccb498d527"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.976798 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ada9b18-f961-4399-a186-a9ccb498d527-kube-api-access-bcj5r" (OuterVolumeSpecName: "kube-api-access-bcj5r") pod "0ada9b18-f961-4399-a186-a9ccb498d527" (UID: "0ada9b18-f961-4399-a186-a9ccb498d527"). InnerVolumeSpecName "kube-api-access-bcj5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.988765 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-f45bf" event={"ID":"8fa56c6b-2244-457e-a7ea-cf0f61bc8b10","Type":"ContainerDied","Data":"e7084292d1d12ad4e5ab725fd7fd4e937f583defaf2f8fa75c2d1dce84a33e05"} Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.988801 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7084292d1d12ad4e5ab725fd7fd4e937f583defaf2f8fa75c2d1dce84a33e05" Dec 03 17:20:26 crc kubenswrapper[4841]: I1203 17:20:26.988859 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-f45bf" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.013553 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0ada9b18-f961-4399-a186-a9ccb498d527" (UID: "0ada9b18-f961-4399-a186-a9ccb498d527"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.057050 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data" (OuterVolumeSpecName: "config-data") pod "0ada9b18-f961-4399-a186-a9ccb498d527" (UID: "0ada9b18-f961-4399-a186-a9ccb498d527"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.072888 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.072925 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcj5r\" (UniqueName: \"kubernetes.io/projected/0ada9b18-f961-4399-a186-a9ccb498d527-kube-api-access-bcj5r\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.072935 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.075377 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0ada9b18-f961-4399-a186-a9ccb498d527-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.143756 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.176630 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpgpg\" (UniqueName: \"kubernetes.io/projected/79950490-c5c5-4db6-9121-220947907f6f-kube-api-access-tpgpg\") pod \"79950490-c5c5-4db6-9121-220947907f6f\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.176699 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data-custom\") pod \"79950490-c5c5-4db6-9121-220947907f6f\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.176850 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data\") pod \"79950490-c5c5-4db6-9121-220947907f6f\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.176872 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-combined-ca-bundle\") pod \"79950490-c5c5-4db6-9121-220947907f6f\" (UID: \"79950490-c5c5-4db6-9121-220947907f6f\") " Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.180686 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79950490-c5c5-4db6-9121-220947907f6f-kube-api-access-tpgpg" (OuterVolumeSpecName: "kube-api-access-tpgpg") pod "79950490-c5c5-4db6-9121-220947907f6f" (UID: "79950490-c5c5-4db6-9121-220947907f6f"). InnerVolumeSpecName "kube-api-access-tpgpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.184016 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "79950490-c5c5-4db6-9121-220947907f6f" (UID: "79950490-c5c5-4db6-9121-220947907f6f"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.201688 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79950490-c5c5-4db6-9121-220947907f6f" (UID: "79950490-c5c5-4db6-9121-220947907f6f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.222740 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data" (OuterVolumeSpecName: "config-data") pod "79950490-c5c5-4db6-9121-220947907f6f" (UID: "79950490-c5c5-4db6-9121-220947907f6f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.280652 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.280678 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.280689 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpgpg\" (UniqueName: \"kubernetes.io/projected/79950490-c5c5-4db6-9121-220947907f6f-kube-api-access-tpgpg\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.280697 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/79950490-c5c5-4db6-9121-220947907f6f-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.332987 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-7dd5698d74-2nhcr"] Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.340425 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-7dd5698d74-2nhcr"] Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.545651 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.546116 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="749c6395-a67d-40f3-9347-206393f91228" containerName="glance-log" containerID="cri-o://100f840d1f2804bb16eec10829512b68717d3687c059282364e1be370829b397" gracePeriod=30 Dec 03 
17:20:27 crc kubenswrapper[4841]: I1203 17:20:27.546177 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="749c6395-a67d-40f3-9347-206393f91228" containerName="glance-httpd" containerID="cri-o://b8032a3c094bb56d3d5f5ce52a22a9c276d899fcdea69130fe26a77e81be7ca7" gracePeriod=30 Dec 03 17:20:28 crc kubenswrapper[4841]: I1203 17:20:28.000640 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-6fd49975b6-6x566" event={"ID":"79950490-c5c5-4db6-9121-220947907f6f","Type":"ContainerDied","Data":"11269c1c7e1654aa5196450fa5a5a2052a8c2436ad0985d456f10752d9a6b094"} Dec 03 17:20:28 crc kubenswrapper[4841]: I1203 17:20:28.000958 4841 scope.go:117] "RemoveContainer" containerID="933ac04ff8b576cb70daf62f9d587923d785eded85c83a4a3977e794156c04ad" Dec 03 17:20:28 crc kubenswrapper[4841]: I1203 17:20:28.000699 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-6fd49975b6-6x566" Dec 03 17:20:28 crc kubenswrapper[4841]: I1203 17:20:28.002562 4841 generic.go:334] "Generic (PLEG): container finished" podID="749c6395-a67d-40f3-9347-206393f91228" containerID="100f840d1f2804bb16eec10829512b68717d3687c059282364e1be370829b397" exitCode=143 Dec 03 17:20:28 crc kubenswrapper[4841]: I1203 17:20:28.002611 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"749c6395-a67d-40f3-9347-206393f91228","Type":"ContainerDied","Data":"100f840d1f2804bb16eec10829512b68717d3687c059282364e1be370829b397"} Dec 03 17:20:28 crc kubenswrapper[4841]: I1203 17:20:28.099882 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-6fd49975b6-6x566"] Dec 03 17:20:28 crc kubenswrapper[4841]: I1203 17:20:28.107697 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-6fd49975b6-6x566"] Dec 03 17:20:28 crc kubenswrapper[4841]: I1203 17:20:28.250712 
4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" path="/var/lib/kubelet/pods/0ada9b18-f961-4399-a186-a9ccb498d527/volumes" Dec 03 17:20:28 crc kubenswrapper[4841]: I1203 17:20:28.251251 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79950490-c5c5-4db6-9121-220947907f6f" path="/var/lib/kubelet/pods/79950490-c5c5-4db6-9121-220947907f6f/volumes" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.023791 4841 generic.go:334] "Generic (PLEG): container finished" podID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerID="e2558236d0746a73d50a3a05b611cd0dba02053bc82e2eeab22e659894be5e5b" exitCode=0 Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.023889 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"51ec7e80-3ef6-478e-b388-5835beb1c733","Type":"ContainerDied","Data":"e2558236d0746a73d50a3a05b611cd0dba02053bc82e2eeab22e659894be5e5b"} Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.605616 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.731618 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-combined-ca-bundle\") pod \"51ec7e80-3ef6-478e-b388-5835beb1c733\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.731683 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-public-tls-certs\") pod \"51ec7e80-3ef6-478e-b388-5835beb1c733\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.731722 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"51ec7e80-3ef6-478e-b388-5835beb1c733\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.731783 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-logs\") pod \"51ec7e80-3ef6-478e-b388-5835beb1c733\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.731848 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-config-data\") pod \"51ec7e80-3ef6-478e-b388-5835beb1c733\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.731869 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzzkc\" (UniqueName: 
\"kubernetes.io/projected/51ec7e80-3ef6-478e-b388-5835beb1c733-kube-api-access-qzzkc\") pod \"51ec7e80-3ef6-478e-b388-5835beb1c733\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.731990 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-httpd-run\") pod \"51ec7e80-3ef6-478e-b388-5835beb1c733\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.732016 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-scripts\") pod \"51ec7e80-3ef6-478e-b388-5835beb1c733\" (UID: \"51ec7e80-3ef6-478e-b388-5835beb1c733\") " Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.733223 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-logs" (OuterVolumeSpecName: "logs") pod "51ec7e80-3ef6-478e-b388-5835beb1c733" (UID: "51ec7e80-3ef6-478e-b388-5835beb1c733"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.733471 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "51ec7e80-3ef6-478e-b388-5835beb1c733" (UID: "51ec7e80-3ef6-478e-b388-5835beb1c733"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.738029 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-scripts" (OuterVolumeSpecName: "scripts") pod "51ec7e80-3ef6-478e-b388-5835beb1c733" (UID: "51ec7e80-3ef6-478e-b388-5835beb1c733"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.745066 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51ec7e80-3ef6-478e-b388-5835beb1c733-kube-api-access-qzzkc" (OuterVolumeSpecName: "kube-api-access-qzzkc") pod "51ec7e80-3ef6-478e-b388-5835beb1c733" (UID: "51ec7e80-3ef6-478e-b388-5835beb1c733"). InnerVolumeSpecName "kube-api-access-qzzkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.749994 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "51ec7e80-3ef6-478e-b388-5835beb1c733" (UID: "51ec7e80-3ef6-478e-b388-5835beb1c733"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.782113 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51ec7e80-3ef6-478e-b388-5835beb1c733" (UID: "51ec7e80-3ef6-478e-b388-5835beb1c733"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.810990 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-config-data" (OuterVolumeSpecName: "config-data") pod "51ec7e80-3ef6-478e-b388-5835beb1c733" (UID: "51ec7e80-3ef6-478e-b388-5835beb1c733"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.834068 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.834107 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.834152 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.834162 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.834172 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.834181 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzzkc\" (UniqueName: 
\"kubernetes.io/projected/51ec7e80-3ef6-478e-b388-5835beb1c733-kube-api-access-qzzkc\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.834190 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/51ec7e80-3ef6-478e-b388-5835beb1c733-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.840257 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "51ec7e80-3ef6-478e-b388-5835beb1c733" (UID: "51ec7e80-3ef6-478e-b388-5835beb1c733"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.843099 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.859436 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.937308 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/51ec7e80-3ef6-478e-b388-5835beb1c733-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:29 crc kubenswrapper[4841]: I1203 17:20:29.937334 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.036497 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.037924 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7n6f\" (UniqueName: \"kubernetes.io/projected/1cda072f-d517-4ac5-91f7-e65b07969c55-kube-api-access-s7n6f\") pod \"1cda072f-d517-4ac5-91f7-e65b07969c55\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.038030 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-run-httpd\") pod \"1cda072f-d517-4ac5-91f7-e65b07969c55\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.038068 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-combined-ca-bundle\") pod \"1cda072f-d517-4ac5-91f7-e65b07969c55\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.038138 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-scripts\") pod \"1cda072f-d517-4ac5-91f7-e65b07969c55\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.038198 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-sg-core-conf-yaml\") pod \"1cda072f-d517-4ac5-91f7-e65b07969c55\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.038237 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-log-httpd\") pod \"1cda072f-d517-4ac5-91f7-e65b07969c55\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.038268 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-config-data\") pod \"1cda072f-d517-4ac5-91f7-e65b07969c55\" (UID: \"1cda072f-d517-4ac5-91f7-e65b07969c55\") " Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.038299 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1cda072f-d517-4ac5-91f7-e65b07969c55" (UID: "1cda072f-d517-4ac5-91f7-e65b07969c55"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.038575 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1cda072f-d517-4ac5-91f7-e65b07969c55" (UID: "1cda072f-d517-4ac5-91f7-e65b07969c55"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.038666 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.037354 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"51ec7e80-3ef6-478e-b388-5835beb1c733","Type":"ContainerDied","Data":"a1e807e1e69a3892c513ca6d36478667318c9aefcc7f05faad42e0fa6d3b9759"} Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.041369 4841 generic.go:334] "Generic (PLEG): container finished" podID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerID="29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71" exitCode=0 Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.041440 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.041438 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerDied","Data":"29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71"} Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.041475 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cda072f-d517-4ac5-91f7-e65b07969c55","Type":"ContainerDied","Data":"0c7a044aff2600adc7b75d726f9e831a690bf01aff1044c96cb02833e98d8067"} Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.041506 4841 scope.go:117] "RemoveContainer" containerID="e2558236d0746a73d50a3a05b611cd0dba02053bc82e2eeab22e659894be5e5b" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.047980 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-scripts" (OuterVolumeSpecName: "scripts") pod "1cda072f-d517-4ac5-91f7-e65b07969c55" (UID: "1cda072f-d517-4ac5-91f7-e65b07969c55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.061621 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cda072f-d517-4ac5-91f7-e65b07969c55-kube-api-access-s7n6f" (OuterVolumeSpecName: "kube-api-access-s7n6f") pod "1cda072f-d517-4ac5-91f7-e65b07969c55" (UID: "1cda072f-d517-4ac5-91f7-e65b07969c55"). InnerVolumeSpecName "kube-api-access-s7n6f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.071098 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1cda072f-d517-4ac5-91f7-e65b07969c55" (UID: "1cda072f-d517-4ac5-91f7-e65b07969c55"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.141009 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.141035 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cda072f-d517-4ac5-91f7-e65b07969c55-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.141045 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7n6f\" (UniqueName: \"kubernetes.io/projected/1cda072f-d517-4ac5-91f7-e65b07969c55-kube-api-access-s7n6f\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.141056 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.161018 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1cda072f-d517-4ac5-91f7-e65b07969c55" (UID: "1cda072f-d517-4ac5-91f7-e65b07969c55"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.186971 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-config-data" (OuterVolumeSpecName: "config-data") pod "1cda072f-d517-4ac5-91f7-e65b07969c55" (UID: "1cda072f-d517-4ac5-91f7-e65b07969c55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.242281 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.242305 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cda072f-d517-4ac5-91f7-e65b07969c55-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.266100 4841 scope.go:117] "RemoveContainer" containerID="aaf61c90e47e2bab624d684c2d66061e20f328133d254a00df33a94a9fcbbb8c" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.329349 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.329387 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.329760 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330347 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79950490-c5c5-4db6-9121-220947907f6f" containerName="heat-cfnapi" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330378 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="79950490-c5c5-4db6-9121-220947907f6f" containerName="heat-cfnapi" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330391 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79950490-c5c5-4db6-9121-220947907f6f" containerName="heat-cfnapi" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330398 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="79950490-c5c5-4db6-9121-220947907f6f" containerName="heat-cfnapi" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330412 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="ceilometer-notification-agent" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330439 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="ceilometer-notification-agent" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330453 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8770bd7d-05a2-4cc4-9484-aad5e79ad58d" containerName="mariadb-account-create-update" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330458 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8770bd7d-05a2-4cc4-9484-aad5e79ad58d" containerName="mariadb-account-create-update" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330466 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerName="glance-log" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330472 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerName="glance-log" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330480 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fa56c6b-2244-457e-a7ea-cf0f61bc8b10" containerName="mariadb-database-create" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330486 4841 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="8fa56c6b-2244-457e-a7ea-cf0f61bc8b10" containerName="mariadb-database-create" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330494 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b5e1317-cb39-41ff-bd74-0ff6f5888838" containerName="mariadb-database-create" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330529 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b5e1317-cb39-41ff-bd74-0ff6f5888838" containerName="mariadb-database-create" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330539 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="defee520-123a-4d81-89c9-3ac1100f37ea" containerName="mariadb-database-create" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330545 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="defee520-123a-4d81-89c9-3ac1100f37ea" containerName="mariadb-database-create" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330554 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" containerName="heat-api" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330559 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" containerName="heat-api" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330566 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" containerName="heat-api" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330572 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" containerName="heat-api" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330599 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b588dfb-c0c3-44c9-bb78-88f38c32d7d8" containerName="mariadb-account-create-update" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330621 4841 
state_mem.go:107] "Deleted CPUSet assignment" podUID="3b588dfb-c0c3-44c9-bb78-88f38c32d7d8" containerName="mariadb-account-create-update" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330639 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="ceilometer-central-agent" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330645 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="ceilometer-central-agent" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330653 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerName="glance-httpd" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330658 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerName="glance-httpd" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330668 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="proxy-httpd" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330674 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="proxy-httpd" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330684 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bad0d4a1-7f8d-4417-bbb2-df84240ddb39" containerName="mariadb-account-create-update" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330690 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bad0d4a1-7f8d-4417-bbb2-df84240ddb39" containerName="mariadb-account-create-update" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.330701 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="sg-core" Dec 03 17:20:30 crc kubenswrapper[4841]: 
I1203 17:20:30.330707 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="sg-core" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330881 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="ceilometer-notification-agent" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330894 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="sg-core" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330933 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" containerName="heat-api" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330946 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="ceilometer-central-agent" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330953 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ada9b18-f961-4399-a186-a9ccb498d527" containerName="heat-api" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330961 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bad0d4a1-7f8d-4417-bbb2-df84240ddb39" containerName="mariadb-account-create-update" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330973 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b588dfb-c0c3-44c9-bb78-88f38c32d7d8" containerName="mariadb-account-create-update" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330984 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="79950490-c5c5-4db6-9121-220947907f6f" containerName="heat-cfnapi" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330994 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerName="glance-httpd" Dec 03 
17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.330999 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="51ec7e80-3ef6-478e-b388-5835beb1c733" containerName="glance-log" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.331010 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" containerName="proxy-httpd" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.331020 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b5e1317-cb39-41ff-bd74-0ff6f5888838" containerName="mariadb-database-create" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.331031 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8770bd7d-05a2-4cc4-9484-aad5e79ad58d" containerName="mariadb-account-create-update" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.331043 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="defee520-123a-4d81-89c9-3ac1100f37ea" containerName="mariadb-database-create" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.331052 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fa56c6b-2244-457e-a7ea-cf0f61bc8b10" containerName="mariadb-database-create" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.331531 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="79950490-c5c5-4db6-9121-220947907f6f" containerName="heat-cfnapi" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.332159 4841 scope.go:117] "RemoveContainer" containerID="6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.332599 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.338105 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.338254 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.339450 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.383107 4841 scope.go:117] "RemoveContainer" containerID="a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.399205 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.407083 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.421753 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.423882 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.426275 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.427027 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.432394 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.454366 4841 scope.go:117] "RemoveContainer" containerID="e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.465720 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.465767 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.465809 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-logs\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.465836 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.465854 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5xpz\" (UniqueName: \"kubernetes.io/projected/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-kube-api-access-l5xpz\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.466603 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.466762 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.467043 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 
17:20:30.480789 4841 scope.go:117] "RemoveContainer" containerID="29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.506841 4841 scope.go:117] "RemoveContainer" containerID="6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.508448 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d\": container with ID starting with 6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d not found: ID does not exist" containerID="6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.508493 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d"} err="failed to get container status \"6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d\": rpc error: code = NotFound desc = could not find container \"6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d\": container with ID starting with 6b75c7644013c0b58c23a7f1916a017835a2fdb43a45e49b93816214c0c9b41d not found: ID does not exist" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.508521 4841 scope.go:117] "RemoveContainer" containerID="a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.510495 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51\": container with ID starting with a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51 not found: ID does not exist" 
containerID="a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.510540 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51"} err="failed to get container status \"a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51\": rpc error: code = NotFound desc = could not find container \"a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51\": container with ID starting with a7450491facf3b7c778e8f3cf846edddd6d201b71fa30ddc6dcd1012a3677c51 not found: ID does not exist" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.510589 4841 scope.go:117] "RemoveContainer" containerID="e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.510857 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce\": container with ID starting with e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce not found: ID does not exist" containerID="e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.510915 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce"} err="failed to get container status \"e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce\": rpc error: code = NotFound desc = could not find container \"e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce\": container with ID starting with e96d769977e0f6daec133ad7ee934688d1ba45f436c96f8c085eb1c581117cce not found: ID does not exist" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.510931 4841 scope.go:117] 
"RemoveContainer" containerID="29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71" Dec 03 17:20:30 crc kubenswrapper[4841]: E1203 17:20:30.511167 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71\": container with ID starting with 29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71 not found: ID does not exist" containerID="29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.511186 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71"} err="failed to get container status \"29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71\": rpc error: code = NotFound desc = could not find container \"29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71\": container with ID starting with 29b6f96102116acc2c54a04a3cb52924feea4528e36c5902324368bbd438ba71 not found: ID does not exist" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568396 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568445 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568493 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-772gq\" (UniqueName: \"kubernetes.io/projected/60e1d74b-b77f-47d3-9325-9e2557bee1d7-kube-api-access-772gq\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568555 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568581 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568600 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568620 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568641 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-run-httpd\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568663 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-config-data\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568677 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-log-httpd\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568695 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-logs\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568714 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568731 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-scripts\") pod \"ceilometer-0\" (UID: 
\"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.568770 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5xpz\" (UniqueName: \"kubernetes.io/projected/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-kube-api-access-l5xpz\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.569342 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-logs\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.569495 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.571303 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 
17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.573670 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-config-data\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.575966 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.580186 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-scripts\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.580407 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.587697 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5xpz\" (UniqueName: \"kubernetes.io/projected/3c031e76-65c3-4f33-a588-3a76aa8a2c0b-kube-api-access-l5xpz\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.598064 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"3c031e76-65c3-4f33-a588-3a76aa8a2c0b\") " pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.670735 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-772gq\" (UniqueName: \"kubernetes.io/projected/60e1d74b-b77f-47d3-9325-9e2557bee1d7-kube-api-access-772gq\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.670856 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.670881 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-run-httpd\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.670919 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-config-data\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.670937 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-log-httpd\") pod \"ceilometer-0\" (UID: 
\"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.670962 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.670993 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-scripts\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.671874 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-log-httpd\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.671887 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-run-httpd\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.674624 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.674639 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-scripts\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.675061 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-config-data\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.675464 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.687033 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-772gq\" (UniqueName: \"kubernetes.io/projected/60e1d74b-b77f-47d3-9325-9e2557bee1d7-kube-api-access-772gq\") pod \"ceilometer-0\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " pod="openstack/ceilometer-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.687689 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 17:20:30 crc kubenswrapper[4841]: I1203 17:20:30.765797 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.078965 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.122899 4841 generic.go:334] "Generic (PLEG): container finished" podID="749c6395-a67d-40f3-9347-206393f91228" containerID="b8032a3c094bb56d3d5f5ce52a22a9c276d899fcdea69130fe26a77e81be7ca7" exitCode=0 Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.122954 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"749c6395-a67d-40f3-9347-206393f91228","Type":"ContainerDied","Data":"b8032a3c094bb56d3d5f5ce52a22a9c276d899fcdea69130fe26a77e81be7ca7"} Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.333414 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.404916 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.523472 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.704615 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-internal-tls-certs\") pod \"749c6395-a67d-40f3-9347-206393f91228\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.704802 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"749c6395-a67d-40f3-9347-206393f91228\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.704833 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-logs\") pod \"749c6395-a67d-40f3-9347-206393f91228\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.704861 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-config-data\") pod \"749c6395-a67d-40f3-9347-206393f91228\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.704878 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-combined-ca-bundle\") pod \"749c6395-a67d-40f3-9347-206393f91228\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.704896 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-httpd-run\") pod \"749c6395-a67d-40f3-9347-206393f91228\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.705024 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd426\" (UniqueName: \"kubernetes.io/projected/749c6395-a67d-40f3-9347-206393f91228-kube-api-access-nd426\") pod \"749c6395-a67d-40f3-9347-206393f91228\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.705047 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-scripts\") pod \"749c6395-a67d-40f3-9347-206393f91228\" (UID: \"749c6395-a67d-40f3-9347-206393f91228\") " Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.707424 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "749c6395-a67d-40f3-9347-206393f91228" (UID: "749c6395-a67d-40f3-9347-206393f91228"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.707605 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-logs" (OuterVolumeSpecName: "logs") pod "749c6395-a67d-40f3-9347-206393f91228" (UID: "749c6395-a67d-40f3-9347-206393f91228"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.713501 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance") pod "749c6395-a67d-40f3-9347-206393f91228" (UID: "749c6395-a67d-40f3-9347-206393f91228"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.725239 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-scripts" (OuterVolumeSpecName: "scripts") pod "749c6395-a67d-40f3-9347-206393f91228" (UID: "749c6395-a67d-40f3-9347-206393f91228"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.740660 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "749c6395-a67d-40f3-9347-206393f91228" (UID: "749c6395-a67d-40f3-9347-206393f91228"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.741691 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/749c6395-a67d-40f3-9347-206393f91228-kube-api-access-nd426" (OuterVolumeSpecName: "kube-api-access-nd426") pod "749c6395-a67d-40f3-9347-206393f91228" (UID: "749c6395-a67d-40f3-9347-206393f91228"). InnerVolumeSpecName "kube-api-access-nd426". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.769052 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-config-data" (OuterVolumeSpecName: "config-data") pod "749c6395-a67d-40f3-9347-206393f91228" (UID: "749c6395-a67d-40f3-9347-206393f91228"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.771351 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "749c6395-a67d-40f3-9347-206393f91228" (UID: "749c6395-a67d-40f3-9347-206393f91228"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.806929 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.806963 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.806973 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.806981 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:31 crc 
kubenswrapper[4841]: I1203 17:20:31.806990 4841 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/749c6395-a67d-40f3-9347-206393f91228-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.807001 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd426\" (UniqueName: \"kubernetes.io/projected/749c6395-a67d-40f3-9347-206393f91228-kube-api-access-nd426\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.807008 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.807017 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/749c6395-a67d-40f3-9347-206393f91228-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.825290 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 17:20:31 crc kubenswrapper[4841]: I1203 17:20:31.912868 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.127131 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gddr9"] Dec 03 17:20:32 crc kubenswrapper[4841]: E1203 17:20:32.127631 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749c6395-a67d-40f3-9347-206393f91228" containerName="glance-httpd" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.127646 4841 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="749c6395-a67d-40f3-9347-206393f91228" containerName="glance-httpd" Dec 03 17:20:32 crc kubenswrapper[4841]: E1203 17:20:32.127670 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="749c6395-a67d-40f3-9347-206393f91228" containerName="glance-log" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.127677 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="749c6395-a67d-40f3-9347-206393f91228" containerName="glance-log" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.127866 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="749c6395-a67d-40f3-9347-206393f91228" containerName="glance-httpd" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.127884 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="749c6395-a67d-40f3-9347-206393f91228" containerName="glance-log" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.128827 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.130439 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dmjlh" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.133974 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.134148 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.138242 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gddr9"] Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.146145 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.146825 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"749c6395-a67d-40f3-9347-206393f91228","Type":"ContainerDied","Data":"fe89a96ac4cacdfeec1e4a4f568c21786cd28ebe31e752d237149751ff912400"} Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.146885 4841 scope.go:117] "RemoveContainer" containerID="b8032a3c094bb56d3d5f5ce52a22a9c276d899fcdea69130fe26a77e81be7ca7" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.162586 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerStarted","Data":"6be2ecd6befa11342ee460841c947aca96648b04ed9a465545fc89bae8314224"} Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.168100 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c031e76-65c3-4f33-a588-3a76aa8a2c0b","Type":"ContainerStarted","Data":"4f037ee83bf761730f53c29417a259b2f122fb83948d3a183f812e7f61476da6"} Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.168148 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"3c031e76-65c3-4f33-a588-3a76aa8a2c0b","Type":"ContainerStarted","Data":"462c0c67b318093eb6067d4b24ae7af44a6863787823dbff9963613bcd9fd505"} Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.194883 4841 scope.go:117] "RemoveContainer" containerID="100f840d1f2804bb16eec10829512b68717d3687c059282364e1be370829b397" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.228104 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.300951 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1cda072f-d517-4ac5-91f7-e65b07969c55" path="/var/lib/kubelet/pods/1cda072f-d517-4ac5-91f7-e65b07969c55/volumes" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.301996 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51ec7e80-3ef6-478e-b388-5835beb1c733" path="/var/lib/kubelet/pods/51ec7e80-3ef6-478e-b388-5835beb1c733/volumes" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.305835 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.306873 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.322185 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mg9x8\" (UniqueName: \"kubernetes.io/projected/1c71fa5a-194d-4d9f-a591-c4db053a2b01-kube-api-access-mg9x8\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.322295 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-config-data\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.322348 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-scripts\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 
17:20:32.322415 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.324004 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.328285 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.337341 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.338118 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.423995 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0e112d6-6648-48f5-872e-f4ac5e81de4e-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424051 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424114 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424141 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0e112d6-6648-48f5-872e-f4ac5e81de4e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424172 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424209 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424243 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424291 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxxr2\" (UniqueName: \"kubernetes.io/projected/a0e112d6-6648-48f5-872e-f4ac5e81de4e-kube-api-access-zxxr2\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424349 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mg9x8\" (UniqueName: \"kubernetes.io/projected/1c71fa5a-194d-4d9f-a591-c4db053a2b01-kube-api-access-mg9x8\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424403 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-config-data\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424427 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-scripts\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.424462 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 
17:20:32.437975 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.438744 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-config-data\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.447021 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mg9x8\" (UniqueName: \"kubernetes.io/projected/1c71fa5a-194d-4d9f-a591-c4db053a2b01-kube-api-access-mg9x8\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.452536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-scripts\") pod \"nova-cell0-conductor-db-sync-gddr9\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") " pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.465219 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gddr9" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.530004 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxxr2\" (UniqueName: \"kubernetes.io/projected/a0e112d6-6648-48f5-872e-f4ac5e81de4e-kube-api-access-zxxr2\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.530121 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0e112d6-6648-48f5-872e-f4ac5e81de4e-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.530151 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.530203 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.530227 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0e112d6-6648-48f5-872e-f4ac5e81de4e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 
17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.530263 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.530310 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.530339 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.531637 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.533534 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0e112d6-6648-48f5-872e-f4ac5e81de4e-logs\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.534576 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a0e112d6-6648-48f5-872e-f4ac5e81de4e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.539482 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.546492 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.564513 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxxr2\" (UniqueName: \"kubernetes.io/projected/a0e112d6-6648-48f5-872e-f4ac5e81de4e-kube-api-access-zxxr2\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.578602 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.583703 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a0e112d6-6648-48f5-872e-f4ac5e81de4e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.623874 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-0\" (UID: \"a0e112d6-6648-48f5-872e-f4ac5e81de4e\") " pod="openstack/glance-default-internal-api-0" Dec 03 17:20:32 crc kubenswrapper[4841]: I1203 17:20:32.651527 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:33 crc kubenswrapper[4841]: I1203 17:20:33.131549 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gddr9"] Dec 03 17:20:33 crc kubenswrapper[4841]: W1203 17:20:33.151191 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c71fa5a_194d_4d9f_a591_c4db053a2b01.slice/crio-6b9a5d5653fc55433c0399de55dad8ffbef49d9b3538d80e908519170ccb2139 WatchSource:0}: Error finding container 6b9a5d5653fc55433c0399de55dad8ffbef49d9b3538d80e908519170ccb2139: Status 404 returned error can't find the container with id 6b9a5d5653fc55433c0399de55dad8ffbef49d9b3538d80e908519170ccb2139 Dec 03 17:20:33 crc kubenswrapper[4841]: I1203 17:20:33.209137 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerStarted","Data":"51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674"} Dec 03 17:20:33 crc kubenswrapper[4841]: I1203 17:20:33.213370 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"3c031e76-65c3-4f33-a588-3a76aa8a2c0b","Type":"ContainerStarted","Data":"49954cb5192594adc8b6a2319fe8987b98c6cf4490938aa6af02cdbb67058de7"} Dec 03 17:20:33 crc kubenswrapper[4841]: I1203 17:20:33.218203 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gddr9" event={"ID":"1c71fa5a-194d-4d9f-a591-c4db053a2b01","Type":"ContainerStarted","Data":"6b9a5d5653fc55433c0399de55dad8ffbef49d9b3538d80e908519170ccb2139"} Dec 03 17:20:33 crc kubenswrapper[4841]: I1203 17:20:33.261128 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.261102718 podStartE2EDuration="3.261102718s" podCreationTimestamp="2025-12-03 17:20:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:33.240176689 +0000 UTC m=+1227.627697416" watchObservedRunningTime="2025-12-03 17:20:33.261102718 +0000 UTC m=+1227.648623445" Dec 03 17:20:33 crc kubenswrapper[4841]: I1203 17:20:33.369030 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 17:20:33 crc kubenswrapper[4841]: I1203 17:20:33.398568 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-555b568f78-v86bh" Dec 03 17:20:33 crc kubenswrapper[4841]: I1203 17:20:33.445598 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7848f458c5-zfrpj"] Dec 03 17:20:33 crc kubenswrapper[4841]: I1203 17:20:33.445828 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-7848f458c5-zfrpj" podUID="702ea708-3829-4dcb-9cee-8db1f6fbb715" containerName="heat-engine" containerID="cri-o://55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4" gracePeriod=60 Dec 03 17:20:34 crc kubenswrapper[4841]: I1203 17:20:34.230237 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0e112d6-6648-48f5-872e-f4ac5e81de4e","Type":"ContainerStarted","Data":"cb46f7d39ea597840c729c5a52d6690b3f8245177fb8558d04c4a5ee82ade931"} Dec 03 17:20:34 crc kubenswrapper[4841]: I1203 17:20:34.230484 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0e112d6-6648-48f5-872e-f4ac5e81de4e","Type":"ContainerStarted","Data":"eb5c57fb9d5dbe2581dd854706754ca9f938e2519e0ef63db48cffebe18d4532"} Dec 03 17:20:34 crc kubenswrapper[4841]: I1203 17:20:34.233732 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerStarted","Data":"d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6"} Dec 03 17:20:34 crc kubenswrapper[4841]: I1203 17:20:34.233758 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerStarted","Data":"362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84"} Dec 03 17:20:34 crc kubenswrapper[4841]: I1203 17:20:34.249160 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="749c6395-a67d-40f3-9347-206393f91228" path="/var/lib/kubelet/pods/749c6395-a67d-40f3-9347-206393f91228/volumes" Dec 03 17:20:35 crc kubenswrapper[4841]: I1203 17:20:35.246194 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a0e112d6-6648-48f5-872e-f4ac5e81de4e","Type":"ContainerStarted","Data":"723ae65afdb8cf1cd809684e177b0fe03930f1c67f04acb95eef8b70bf278296"} Dec 03 17:20:35 crc kubenswrapper[4841]: I1203 17:20:35.265800 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.265779498 podStartE2EDuration="3.265779498s" podCreationTimestamp="2025-12-03 
17:20:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:35.264257323 +0000 UTC m=+1229.651778050" watchObservedRunningTime="2025-12-03 17:20:35.265779498 +0000 UTC m=+1229.653300225" Dec 03 17:20:36 crc kubenswrapper[4841]: E1203 17:20:36.084156 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 17:20:36 crc kubenswrapper[4841]: E1203 17:20:36.095288 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 17:20:36 crc kubenswrapper[4841]: E1203 17:20:36.096708 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Dec 03 17:20:36 crc kubenswrapper[4841]: E1203 17:20:36.096737 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-7848f458c5-zfrpj" podUID="702ea708-3829-4dcb-9cee-8db1f6fbb715" containerName="heat-engine" Dec 03 17:20:37 crc kubenswrapper[4841]: I1203 17:20:37.285350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerStarted","Data":"78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c"} Dec 03 17:20:37 crc kubenswrapper[4841]: I1203 17:20:37.285706 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="ceilometer-central-agent" containerID="cri-o://51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674" gracePeriod=30 Dec 03 17:20:37 crc kubenswrapper[4841]: I1203 17:20:37.285769 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="sg-core" containerID="cri-o://d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6" gracePeriod=30 Dec 03 17:20:37 crc kubenswrapper[4841]: I1203 17:20:37.285795 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 17:20:37 crc kubenswrapper[4841]: I1203 17:20:37.285813 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="ceilometer-notification-agent" containerID="cri-o://362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84" gracePeriod=30 Dec 03 17:20:37 crc kubenswrapper[4841]: I1203 17:20:37.285767 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="proxy-httpd" containerID="cri-o://78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c" gracePeriod=30 Dec 03 17:20:37 crc kubenswrapper[4841]: I1203 17:20:37.326161 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.663863035 podStartE2EDuration="7.3261412s" podCreationTimestamp="2025-12-03 17:20:30 +0000 UTC" 
firstStartedPulling="2025-12-03 17:20:31.42106835 +0000 UTC m=+1225.808589077" lastFinishedPulling="2025-12-03 17:20:36.083346515 +0000 UTC m=+1230.470867242" observedRunningTime="2025-12-03 17:20:37.311217301 +0000 UTC m=+1231.698738028" watchObservedRunningTime="2025-12-03 17:20:37.3261412 +0000 UTC m=+1231.713661927" Dec 03 17:20:38 crc kubenswrapper[4841]: I1203 17:20:38.298097 4841 generic.go:334] "Generic (PLEG): container finished" podID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerID="78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c" exitCode=0 Dec 03 17:20:38 crc kubenswrapper[4841]: I1203 17:20:38.298451 4841 generic.go:334] "Generic (PLEG): container finished" podID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerID="d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6" exitCode=2 Dec 03 17:20:38 crc kubenswrapper[4841]: I1203 17:20:38.298466 4841 generic.go:334] "Generic (PLEG): container finished" podID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerID="362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84" exitCode=0 Dec 03 17:20:38 crc kubenswrapper[4841]: I1203 17:20:38.298162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerDied","Data":"78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c"} Dec 03 17:20:38 crc kubenswrapper[4841]: I1203 17:20:38.298514 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerDied","Data":"d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6"} Dec 03 17:20:38 crc kubenswrapper[4841]: I1203 17:20:38.298535 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerDied","Data":"362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84"} Dec 03 17:20:40 crc 
kubenswrapper[4841]: I1203 17:20:40.688196 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 17:20:40 crc kubenswrapper[4841]: I1203 17:20:40.688445 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 17:20:40 crc kubenswrapper[4841]: I1203 17:20:40.738494 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 17:20:40 crc kubenswrapper[4841]: I1203 17:20:40.738570 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 17:20:41 crc kubenswrapper[4841]: I1203 17:20:41.326558 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 17:20:41 crc kubenswrapper[4841]: I1203 17:20:41.326802 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 17:20:42 crc kubenswrapper[4841]: I1203 17:20:42.653191 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:42 crc kubenswrapper[4841]: I1203 17:20:42.653563 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:42 crc kubenswrapper[4841]: I1203 17:20:42.694151 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:42 crc kubenswrapper[4841]: I1203 17:20:42.699408 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.291876 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.342064 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-combined-ca-bundle\") pod \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.342172 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-log-httpd\") pod \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.342212 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-772gq\" (UniqueName: \"kubernetes.io/projected/60e1d74b-b77f-47d3-9325-9e2557bee1d7-kube-api-access-772gq\") pod \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.342299 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-sg-core-conf-yaml\") pod \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.342329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-config-data\") pod \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.342408 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-scripts\") pod \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.342479 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-run-httpd\") pod \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\" (UID: \"60e1d74b-b77f-47d3-9325-9e2557bee1d7\") " Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.342887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "60e1d74b-b77f-47d3-9325-9e2557bee1d7" (UID: "60e1d74b-b77f-47d3-9325-9e2557bee1d7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.343355 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "60e1d74b-b77f-47d3-9325-9e2557bee1d7" (UID: "60e1d74b-b77f-47d3-9325-9e2557bee1d7"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.347498 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60e1d74b-b77f-47d3-9325-9e2557bee1d7-kube-api-access-772gq" (OuterVolumeSpecName: "kube-api-access-772gq") pod "60e1d74b-b77f-47d3-9325-9e2557bee1d7" (UID: "60e1d74b-b77f-47d3-9325-9e2557bee1d7"). InnerVolumeSpecName "kube-api-access-772gq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.347855 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-scripts" (OuterVolumeSpecName: "scripts") pod "60e1d74b-b77f-47d3-9325-9e2557bee1d7" (UID: "60e1d74b-b77f-47d3-9325-9e2557bee1d7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.379149 4841 generic.go:334] "Generic (PLEG): container finished" podID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerID="51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674" exitCode=0 Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.379213 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerDied","Data":"51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674"} Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.379240 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"60e1d74b-b77f-47d3-9325-9e2557bee1d7","Type":"ContainerDied","Data":"6be2ecd6befa11342ee460841c947aca96648b04ed9a465545fc89bae8314224"} Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.379257 4841 scope.go:117] "RemoveContainer" containerID="78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.379380 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.380747 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "60e1d74b-b77f-47d3-9325-9e2557bee1d7" (UID: "60e1d74b-b77f-47d3-9325-9e2557bee1d7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.385726 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.387163 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.387262 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gddr9" event={"ID":"1c71fa5a-194d-4d9f-a591-c4db053a2b01","Type":"ContainerStarted","Data":"0dfa8076e353712ef5974cfe0b2ae72a1df067ccc7df6f10311934ebd56b28b4"} Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.388521 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.388683 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.391346 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.411523 4841 scope.go:117] "RemoveContainer" containerID="d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.444250 4841 reconciler_common.go:293] "Volume detached for volume 
\"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.444275 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.444284 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.444293 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/60e1d74b-b77f-47d3-9325-9e2557bee1d7-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.444301 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-772gq\" (UniqueName: \"kubernetes.io/projected/60e1d74b-b77f-47d3-9325-9e2557bee1d7-kube-api-access-772gq\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.445978 4841 scope.go:117] "RemoveContainer" containerID="362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.448080 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "60e1d74b-b77f-47d3-9325-9e2557bee1d7" (UID: "60e1d74b-b77f-47d3-9325-9e2557bee1d7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.470703 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gddr9" podStartSLOduration=1.718252117 podStartE2EDuration="11.470676255s" podCreationTimestamp="2025-12-03 17:20:32 +0000 UTC" firstStartedPulling="2025-12-03 17:20:33.153068776 +0000 UTC m=+1227.540589503" lastFinishedPulling="2025-12-03 17:20:42.905492914 +0000 UTC m=+1237.293013641" observedRunningTime="2025-12-03 17:20:43.465204408 +0000 UTC m=+1237.852725125" watchObservedRunningTime="2025-12-03 17:20:43.470676255 +0000 UTC m=+1237.858196992" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.507139 4841 scope.go:117] "RemoveContainer" containerID="51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.510015 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-config-data" (OuterVolumeSpecName: "config-data") pod "60e1d74b-b77f-47d3-9325-9e2557bee1d7" (UID: "60e1d74b-b77f-47d3-9325-9e2557bee1d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.545610 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.545641 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/60e1d74b-b77f-47d3-9325-9e2557bee1d7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.615272 4841 scope.go:117] "RemoveContainer" containerID="78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c" Dec 03 17:20:43 crc kubenswrapper[4841]: E1203 17:20:43.615736 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c\": container with ID starting with 78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c not found: ID does not exist" containerID="78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.615762 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c"} err="failed to get container status \"78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c\": rpc error: code = NotFound desc = could not find container \"78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c\": container with ID starting with 78441a85df5a5696a7b316ec2f927bd5cb5918a5eb6e31e66047cc6cca2b839c not found: ID does not exist" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.615784 4841 scope.go:117] "RemoveContainer" 
containerID="d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6" Dec 03 17:20:43 crc kubenswrapper[4841]: E1203 17:20:43.616107 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6\": container with ID starting with d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6 not found: ID does not exist" containerID="d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.616122 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6"} err="failed to get container status \"d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6\": rpc error: code = NotFound desc = could not find container \"d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6\": container with ID starting with d530c5254efd68170485295508901164d060d2c37e6bbe8190bfa2adaad74fc6 not found: ID does not exist" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.616135 4841 scope.go:117] "RemoveContainer" containerID="362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84" Dec 03 17:20:43 crc kubenswrapper[4841]: E1203 17:20:43.616425 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84\": container with ID starting with 362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84 not found: ID does not exist" containerID="362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.616455 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84"} err="failed to get container status \"362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84\": rpc error: code = NotFound desc = could not find container \"362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84\": container with ID starting with 362eb841e337e6d1d7b3064db24668da715a78ae5275a9e6f3a5be4d06e3fb84 not found: ID does not exist" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.616468 4841 scope.go:117] "RemoveContainer" containerID="51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674" Dec 03 17:20:43 crc kubenswrapper[4841]: E1203 17:20:43.616661 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674\": container with ID starting with 51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674 not found: ID does not exist" containerID="51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.616682 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674"} err="failed to get container status \"51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674\": rpc error: code = NotFound desc = could not find container \"51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674\": container with ID starting with 51aa60f0d3985d2ba1fc027688c8b36056c8418963f93d1d8c070aa690c54674 not found: ID does not exist" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.724672 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.737204 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 
03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.752485 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:20:43 crc kubenswrapper[4841]: E1203 17:20:43.753336 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="proxy-httpd" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.753411 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="proxy-httpd" Dec 03 17:20:43 crc kubenswrapper[4841]: E1203 17:20:43.753483 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="ceilometer-notification-agent" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.753537 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="ceilometer-notification-agent" Dec 03 17:20:43 crc kubenswrapper[4841]: E1203 17:20:43.753605 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="ceilometer-central-agent" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.753660 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="ceilometer-central-agent" Dec 03 17:20:43 crc kubenswrapper[4841]: E1203 17:20:43.753719 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="sg-core" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.753834 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="sg-core" Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.754080 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="ceilometer-central-agent" Dec 03 17:20:43 crc 
kubenswrapper[4841]: I1203 17:20:43.754153 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="proxy-httpd"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.754216 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="sg-core"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.754285 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" containerName="ceilometer-notification-agent"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.755853 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.758119 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.758303 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.777702 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.853345 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-log-httpd\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.853404 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5djlh\" (UniqueName: \"kubernetes.io/projected/01203ee3-03f7-4d19-b510-f15cfc780aa5-kube-api-access-5djlh\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.853448 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.853505 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-run-httpd\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.853662 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-config-data\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.853935 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-scripts\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.854067 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.955339 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-run-httpd\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.955395 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-config-data\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.955452 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-scripts\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.955494 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.955523 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-log-httpd\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.955550 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5djlh\" (UniqueName: \"kubernetes.io/projected/01203ee3-03f7-4d19-b510-f15cfc780aa5-kube-api-access-5djlh\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.955587 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.956742 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-log-httpd\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.956762 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-run-httpd\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.960444 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-scripts\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.964266 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.964813 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.971866 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-config-data\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:43 crc kubenswrapper[4841]: I1203 17:20:43.977702 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5djlh\" (UniqueName: \"kubernetes.io/projected/01203ee3-03f7-4d19-b510-f15cfc780aa5-kube-api-access-5djlh\") pod \"ceilometer-0\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " pod="openstack/ceilometer-0"
Dec 03 17:20:44 crc kubenswrapper[4841]: I1203 17:20:44.073269 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 17:20:44 crc kubenswrapper[4841]: I1203 17:20:44.262262 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60e1d74b-b77f-47d3-9325-9e2557bee1d7" path="/var/lib/kubelet/pods/60e1d74b-b77f-47d3-9325-9e2557bee1d7/volumes"
Dec 03 17:20:44 crc kubenswrapper[4841]: W1203 17:20:44.570530 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod01203ee3_03f7_4d19_b510_f15cfc780aa5.slice/crio-20c43ce86bcca9a3aeeea5d9d6254374bc55256f8e482bdf4dd86fadd20fc779 WatchSource:0}: Error finding container 20c43ce86bcca9a3aeeea5d9d6254374bc55256f8e482bdf4dd86fadd20fc779: Status 404 returned error can't find the container with id 20c43ce86bcca9a3aeeea5d9d6254374bc55256f8e482bdf4dd86fadd20fc779
Dec 03 17:20:44 crc kubenswrapper[4841]: I1203 17:20:44.574796 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 17:20:45 crc kubenswrapper[4841]: I1203 17:20:45.426253 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 17:20:45 crc kubenswrapper[4841]: I1203 17:20:45.426616 4841 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 17:20:45 crc kubenswrapper[4841]: I1203 17:20:45.427632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerStarted","Data":"34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774"}
Dec 03 17:20:45 crc kubenswrapper[4841]: I1203 17:20:45.427681 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerStarted","Data":"20c43ce86bcca9a3aeeea5d9d6254374bc55256f8e482bdf4dd86fadd20fc779"}
Dec 03 17:20:45 crc kubenswrapper[4841]: I1203 17:20:45.652877 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 03 17:20:45 crc kubenswrapper[4841]: I1203 17:20:45.655142 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 03 17:20:46 crc kubenswrapper[4841]: E1203 17:20:46.082312 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4 is running failed: container process not found" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Dec 03 17:20:46 crc kubenswrapper[4841]: E1203 17:20:46.082802 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4 is running failed: container process not found" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Dec 03 17:20:46 crc kubenswrapper[4841]: E1203 17:20:46.083079 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4 is running failed: container process not found" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"]
Dec 03 17:20:46 crc kubenswrapper[4841]: E1203 17:20:46.083113 4841 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4 is running failed: container process not found" probeType="Readiness" pod="openstack/heat-engine-7848f458c5-zfrpj" podUID="702ea708-3829-4dcb-9cee-8db1f6fbb715" containerName="heat-engine"
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.312140 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7848f458c5-zfrpj"
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.436109 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data\") pod \"702ea708-3829-4dcb-9cee-8db1f6fbb715\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") "
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.436213 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-combined-ca-bundle\") pod \"702ea708-3829-4dcb-9cee-8db1f6fbb715\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") "
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.436340 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkvb6\" (UniqueName: \"kubernetes.io/projected/702ea708-3829-4dcb-9cee-8db1f6fbb715-kube-api-access-kkvb6\") pod \"702ea708-3829-4dcb-9cee-8db1f6fbb715\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") "
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.436388 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data-custom\") pod \"702ea708-3829-4dcb-9cee-8db1f6fbb715\" (UID: \"702ea708-3829-4dcb-9cee-8db1f6fbb715\") "
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.439778 4841 generic.go:334] "Generic (PLEG): container finished" podID="702ea708-3829-4dcb-9cee-8db1f6fbb715" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4" exitCode=0
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.439861 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7848f458c5-zfrpj" event={"ID":"702ea708-3829-4dcb-9cee-8db1f6fbb715","Type":"ContainerDied","Data":"55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4"}
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.439895 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-7848f458c5-zfrpj" event={"ID":"702ea708-3829-4dcb-9cee-8db1f6fbb715","Type":"ContainerDied","Data":"3d38c95eea83c06968d8eeac0aceede794f38fa4b127d0d3621a6a3cd1faa11c"}
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.439868 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-7848f458c5-zfrpj"
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.439938 4841 scope.go:117] "RemoveContainer" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4"
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.444194 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "702ea708-3829-4dcb-9cee-8db1f6fbb715" (UID: "702ea708-3829-4dcb-9cee-8db1f6fbb715"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.444681 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerStarted","Data":"9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b"}
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.445550 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/702ea708-3829-4dcb-9cee-8db1f6fbb715-kube-api-access-kkvb6" (OuterVolumeSpecName: "kube-api-access-kkvb6") pod "702ea708-3829-4dcb-9cee-8db1f6fbb715" (UID: "702ea708-3829-4dcb-9cee-8db1f6fbb715"). InnerVolumeSpecName "kube-api-access-kkvb6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.492966 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "702ea708-3829-4dcb-9cee-8db1f6fbb715" (UID: "702ea708-3829-4dcb-9cee-8db1f6fbb715"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.500225 4841 scope.go:117] "RemoveContainer" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4"
Dec 03 17:20:46 crc kubenswrapper[4841]: E1203 17:20:46.503425 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4\": container with ID starting with 55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4 not found: ID does not exist" containerID="55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4"
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.503495 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4"} err="failed to get container status \"55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4\": rpc error: code = NotFound desc = could not find container \"55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4\": container with ID starting with 55c83505735875f3b5192bbd1f4be92c600185032e2c6a28d6975ce77df0fba4 not found: ID does not exist"
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.503497 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data" (OuterVolumeSpecName: "config-data") pod "702ea708-3829-4dcb-9cee-8db1f6fbb715" (UID: "702ea708-3829-4dcb-9cee-8db1f6fbb715"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.547093 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.547128 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.547139 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkvb6\" (UniqueName: \"kubernetes.io/projected/702ea708-3829-4dcb-9cee-8db1f6fbb715-kube-api-access-kkvb6\") on node \"crc\" DevicePath \"\""
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.547149 4841 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/702ea708-3829-4dcb-9cee-8db1f6fbb715-config-data-custom\") on node \"crc\" DevicePath \"\""
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.769792 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-7848f458c5-zfrpj"]
Dec 03 17:20:46 crc kubenswrapper[4841]: I1203 17:20:46.780045 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-7848f458c5-zfrpj"]
Dec 03 17:20:47 crc kubenswrapper[4841]: I1203 17:20:47.457243 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerStarted","Data":"a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18"}
Dec 03 17:20:48 crc kubenswrapper[4841]: I1203 17:20:48.251866 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="702ea708-3829-4dcb-9cee-8db1f6fbb715" path="/var/lib/kubelet/pods/702ea708-3829-4dcb-9cee-8db1f6fbb715/volumes"
Dec 03 17:20:48 crc kubenswrapper[4841]: I1203 17:20:48.469180 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerStarted","Data":"be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac"}
Dec 03 17:20:48 crc kubenswrapper[4841]: I1203 17:20:48.469404 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 17:20:48 crc kubenswrapper[4841]: I1203 17:20:48.501291 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.197609589 podStartE2EDuration="5.501269948s" podCreationTimestamp="2025-12-03 17:20:43 +0000 UTC" firstStartedPulling="2025-12-03 17:20:44.57300466 +0000 UTC m=+1238.960525387" lastFinishedPulling="2025-12-03 17:20:47.876665009 +0000 UTC m=+1242.264185746" observedRunningTime="2025-12-03 17:20:48.494917619 +0000 UTC m=+1242.882438346" watchObservedRunningTime="2025-12-03 17:20:48.501269948 +0000 UTC m=+1242.888790675"
Dec 03 17:20:55 crc kubenswrapper[4841]: I1203 17:20:55.537775 4841 generic.go:334] "Generic (PLEG): container finished" podID="1c71fa5a-194d-4d9f-a591-c4db053a2b01" containerID="0dfa8076e353712ef5974cfe0b2ae72a1df067ccc7df6f10311934ebd56b28b4" exitCode=0
Dec 03 17:20:55 crc kubenswrapper[4841]: I1203 17:20:55.537862 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gddr9" event={"ID":"1c71fa5a-194d-4d9f-a591-c4db053a2b01","Type":"ContainerDied","Data":"0dfa8076e353712ef5974cfe0b2ae72a1df067ccc7df6f10311934ebd56b28b4"}
Dec 03 17:20:56 crc kubenswrapper[4841]: I1203 17:20:56.985044 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gddr9"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.053005 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-combined-ca-bundle\") pod \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") "
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.053190 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-config-data\") pod \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") "
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.053678 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-scripts\") pod \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") "
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.053746 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg9x8\" (UniqueName: \"kubernetes.io/projected/1c71fa5a-194d-4d9f-a591-c4db053a2b01-kube-api-access-mg9x8\") pod \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\" (UID: \"1c71fa5a-194d-4d9f-a591-c4db053a2b01\") "
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.059147 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-scripts" (OuterVolumeSpecName: "scripts") pod "1c71fa5a-194d-4d9f-a591-c4db053a2b01" (UID: "1c71fa5a-194d-4d9f-a591-c4db053a2b01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.059216 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c71fa5a-194d-4d9f-a591-c4db053a2b01-kube-api-access-mg9x8" (OuterVolumeSpecName: "kube-api-access-mg9x8") pod "1c71fa5a-194d-4d9f-a591-c4db053a2b01" (UID: "1c71fa5a-194d-4d9f-a591-c4db053a2b01"). InnerVolumeSpecName "kube-api-access-mg9x8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.078340 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1c71fa5a-194d-4d9f-a591-c4db053a2b01" (UID: "1c71fa5a-194d-4d9f-a591-c4db053a2b01"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.081868 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-config-data" (OuterVolumeSpecName: "config-data") pod "1c71fa5a-194d-4d9f-a591-c4db053a2b01" (UID: "1c71fa5a-194d-4d9f-a591-c4db053a2b01"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.155979 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.156012 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg9x8\" (UniqueName: \"kubernetes.io/projected/1c71fa5a-194d-4d9f-a591-c4db053a2b01-kube-api-access-mg9x8\") on node \"crc\" DevicePath \"\""
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.156022 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.156031 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1c71fa5a-194d-4d9f-a591-c4db053a2b01-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.564198 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gddr9" event={"ID":"1c71fa5a-194d-4d9f-a591-c4db053a2b01","Type":"ContainerDied","Data":"6b9a5d5653fc55433c0399de55dad8ffbef49d9b3538d80e908519170ccb2139"}
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.564490 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9a5d5653fc55433c0399de55dad8ffbef49d9b3538d80e908519170ccb2139"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.564321 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gddr9"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.708143 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 17:20:57 crc kubenswrapper[4841]: E1203 17:20:57.708594 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c71fa5a-194d-4d9f-a591-c4db053a2b01" containerName="nova-cell0-conductor-db-sync"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.708618 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c71fa5a-194d-4d9f-a591-c4db053a2b01" containerName="nova-cell0-conductor-db-sync"
Dec 03 17:20:57 crc kubenswrapper[4841]: E1203 17:20:57.708640 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="702ea708-3829-4dcb-9cee-8db1f6fbb715" containerName="heat-engine"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.708649 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="702ea708-3829-4dcb-9cee-8db1f6fbb715" containerName="heat-engine"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.708852 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="702ea708-3829-4dcb-9cee-8db1f6fbb715" containerName="heat-engine"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.708887 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c71fa5a-194d-4d9f-a591-c4db053a2b01" containerName="nova-cell0-conductor-db-sync"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.709570 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.712204 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-dmjlh"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.712327 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.734874 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.765679 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4f28bc-9247-408c-a91e-94a0c739bfce-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fd4f28bc-9247-408c-a91e-94a0c739bfce\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.765782 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4f28bc-9247-408c-a91e-94a0c739bfce-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fd4f28bc-9247-408c-a91e-94a0c739bfce\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.765825 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbgdx\" (UniqueName: \"kubernetes.io/projected/fd4f28bc-9247-408c-a91e-94a0c739bfce-kube-api-access-gbgdx\") pod \"nova-cell0-conductor-0\" (UID: \"fd4f28bc-9247-408c-a91e-94a0c739bfce\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.867855 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4f28bc-9247-408c-a91e-94a0c739bfce-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fd4f28bc-9247-408c-a91e-94a0c739bfce\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.867943 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbgdx\" (UniqueName: \"kubernetes.io/projected/fd4f28bc-9247-408c-a91e-94a0c739bfce-kube-api-access-gbgdx\") pod \"nova-cell0-conductor-0\" (UID: \"fd4f28bc-9247-408c-a91e-94a0c739bfce\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.868077 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4f28bc-9247-408c-a91e-94a0c739bfce-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fd4f28bc-9247-408c-a91e-94a0c739bfce\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.873687 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd4f28bc-9247-408c-a91e-94a0c739bfce-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"fd4f28bc-9247-408c-a91e-94a0c739bfce\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.876512 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd4f28bc-9247-408c-a91e-94a0c739bfce-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"fd4f28bc-9247-408c-a91e-94a0c739bfce\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:57 crc kubenswrapper[4841]: I1203 17:20:57.889131 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbgdx\" (UniqueName: \"kubernetes.io/projected/fd4f28bc-9247-408c-a91e-94a0c739bfce-kube-api-access-gbgdx\") pod \"nova-cell0-conductor-0\" (UID: \"fd4f28bc-9247-408c-a91e-94a0c739bfce\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:58 crc kubenswrapper[4841]: I1203 17:20:58.067877 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:58 crc kubenswrapper[4841]: I1203 17:20:58.575999 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fd4f28bc-9247-408c-a91e-94a0c739bfce","Type":"ContainerStarted","Data":"60e97f82c8a6f7c805de7a7db4a34d226806f28c8958c3230258688c2b756d5a"}
Dec 03 17:20:58 crc kubenswrapper[4841]: I1203 17:20:58.578552 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 17:20:59 crc kubenswrapper[4841]: I1203 17:20:59.587472 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"fd4f28bc-9247-408c-a91e-94a0c739bfce","Type":"ContainerStarted","Data":"c951134bd08ca48e2e1634e3875de798f89aa679267e9118011cd282db22e05f"}
Dec 03 17:20:59 crc kubenswrapper[4841]: I1203 17:20:59.587810 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 03 17:20:59 crc kubenswrapper[4841]: I1203 17:20:59.616315 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.61628857 podStartE2EDuration="2.61628857s" podCreationTimestamp="2025-12-03 17:20:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:20:59.601843978 +0000 UTC m=+1253.989364735" watchObservedRunningTime="2025-12-03 17:20:59.61628857 +0000 UTC m=+1254.003809337"
Dec 03 17:21:00 crc kubenswrapper[4841]: I1203 17:21:00.863309 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 17:21:00 crc kubenswrapper[4841]: I1203 17:21:00.864188 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="ceilometer-central-agent" containerID="cri-o://34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774" gracePeriod=30
Dec 03 17:21:00 crc kubenswrapper[4841]: I1203 17:21:00.864249 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="sg-core" containerID="cri-o://a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18" gracePeriod=30
Dec 03 17:21:00 crc kubenswrapper[4841]: I1203 17:21:00.864288 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="ceilometer-notification-agent" containerID="cri-o://9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b" gracePeriod=30
Dec 03 17:21:00 crc kubenswrapper[4841]: I1203 17:21:00.864369 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="proxy-httpd" containerID="cri-o://be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac" gracePeriod=30
Dec 03 17:21:00 crc kubenswrapper[4841]: I1203 17:21:00.880525 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.187:3000/\": EOF"
Dec 03 17:21:01 crc kubenswrapper[4841]: I1203 17:21:01.608177 4841 generic.go:334] "Generic (PLEG): container finished" podID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerID="be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac" exitCode=0
Dec 03 17:21:01 crc kubenswrapper[4841]: I1203 17:21:01.608714 4841 generic.go:334] "Generic (PLEG): container finished" podID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerID="a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18" exitCode=2
Dec 03 17:21:01 crc kubenswrapper[4841]: I1203 17:21:01.608726 4841 generic.go:334] "Generic (PLEG): container finished" podID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerID="34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774" exitCode=0
Dec 03 17:21:01 crc kubenswrapper[4841]: I1203 17:21:01.608418 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerDied","Data":"be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac"}
Dec 03 17:21:01 crc kubenswrapper[4841]: I1203 17:21:01.608811 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerDied","Data":"a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18"}
Dec 03 17:21:01 crc kubenswrapper[4841]: I1203 17:21:01.608830 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerDied","Data":"34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774"}
Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.416599 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.454864 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5djlh\" (UniqueName: \"kubernetes.io/projected/01203ee3-03f7-4d19-b510-f15cfc780aa5-kube-api-access-5djlh\") pod \"01203ee3-03f7-4d19-b510-f15cfc780aa5\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") "
Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.454983 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-combined-ca-bundle\") pod \"01203ee3-03f7-4d19-b510-f15cfc780aa5\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") "
Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.455004 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-scripts\") pod \"01203ee3-03f7-4d19-b510-f15cfc780aa5\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") "
Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.455088 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-run-httpd\") pod \"01203ee3-03f7-4d19-b510-f15cfc780aa5\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") "
Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.455119 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-config-data\") pod \"01203ee3-03f7-4d19-b510-f15cfc780aa5\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") "
Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.455134 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName:
\"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-sg-core-conf-yaml\") pod \"01203ee3-03f7-4d19-b510-f15cfc780aa5\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.455175 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-log-httpd\") pod \"01203ee3-03f7-4d19-b510-f15cfc780aa5\" (UID: \"01203ee3-03f7-4d19-b510-f15cfc780aa5\") " Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.455950 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "01203ee3-03f7-4d19-b510-f15cfc780aa5" (UID: "01203ee3-03f7-4d19-b510-f15cfc780aa5"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.456950 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "01203ee3-03f7-4d19-b510-f15cfc780aa5" (UID: "01203ee3-03f7-4d19-b510-f15cfc780aa5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.471133 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-scripts" (OuterVolumeSpecName: "scripts") pod "01203ee3-03f7-4d19-b510-f15cfc780aa5" (UID: "01203ee3-03f7-4d19-b510-f15cfc780aa5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.480535 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01203ee3-03f7-4d19-b510-f15cfc780aa5-kube-api-access-5djlh" (OuterVolumeSpecName: "kube-api-access-5djlh") pod "01203ee3-03f7-4d19-b510-f15cfc780aa5" (UID: "01203ee3-03f7-4d19-b510-f15cfc780aa5"). InnerVolumeSpecName "kube-api-access-5djlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.496520 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "01203ee3-03f7-4d19-b510-f15cfc780aa5" (UID: "01203ee3-03f7-4d19-b510-f15cfc780aa5"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.546098 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "01203ee3-03f7-4d19-b510-f15cfc780aa5" (UID: "01203ee3-03f7-4d19-b510-f15cfc780aa5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.557828 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5djlh\" (UniqueName: \"kubernetes.io/projected/01203ee3-03f7-4d19-b510-f15cfc780aa5-kube-api-access-5djlh\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.557878 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.557890 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.557920 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.557933 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.557945 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/01203ee3-03f7-4d19-b510-f15cfc780aa5-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.613998 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-config-data" (OuterVolumeSpecName: "config-data") pod "01203ee3-03f7-4d19-b510-f15cfc780aa5" (UID: "01203ee3-03f7-4d19-b510-f15cfc780aa5"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.622460 4841 generic.go:334] "Generic (PLEG): container finished" podID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerID="9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b" exitCode=0 Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.622509 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerDied","Data":"9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b"} Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.622544 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"01203ee3-03f7-4d19-b510-f15cfc780aa5","Type":"ContainerDied","Data":"20c43ce86bcca9a3aeeea5d9d6254374bc55256f8e482bdf4dd86fadd20fc779"} Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.622565 4841 scope.go:117] "RemoveContainer" containerID="be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.622734 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.654152 4841 scope.go:117] "RemoveContainer" containerID="a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.659238 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01203ee3-03f7-4d19-b510-f15cfc780aa5-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.679965 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.699496 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.713501 4841 scope.go:117] "RemoveContainer" containerID="9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.728195 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:02 crc kubenswrapper[4841]: E1203 17:21:02.729228 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="sg-core" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.729350 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="sg-core" Dec 03 17:21:02 crc kubenswrapper[4841]: E1203 17:21:02.729442 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="ceilometer-central-agent" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.729515 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="ceilometer-central-agent" Dec 03 17:21:02 crc kubenswrapper[4841]: E1203 17:21:02.729618 4841 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="proxy-httpd" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.729688 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="proxy-httpd" Dec 03 17:21:02 crc kubenswrapper[4841]: E1203 17:21:02.729761 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="ceilometer-notification-agent" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.729832 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="ceilometer-notification-agent" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.746726 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="ceilometer-central-agent" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.746814 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="proxy-httpd" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.746836 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="ceilometer-notification-agent" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.746854 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" containerName="sg-core" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.758073 4841 scope.go:117] "RemoveContainer" containerID="34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.774825 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.774987 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.788466 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.789095 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.826106 4841 scope.go:117] "RemoveContainer" containerID="be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac" Dec 03 17:21:02 crc kubenswrapper[4841]: E1203 17:21:02.827294 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac\": container with ID starting with be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac not found: ID does not exist" containerID="be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.827350 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac"} err="failed to get container status \"be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac\": rpc error: code = NotFound desc = could not find container \"be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac\": container with ID starting with be26963e8a711d612947920810981651c157f03939739fc6e75e8b69c4fc47ac not found: ID does not exist" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.827385 4841 scope.go:117] "RemoveContainer" containerID="a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18" Dec 03 17:21:02 crc kubenswrapper[4841]: E1203 17:21:02.828182 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18\": container with ID starting with a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18 not found: ID does not exist" containerID="a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.828296 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18"} err="failed to get container status \"a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18\": rpc error: code = NotFound desc = could not find container \"a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18\": container with ID starting with a086e1f6366c5dbf3abed8e086a041a4414a9c16fe60c229dfb206555f598b18 not found: ID does not exist" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.828352 4841 scope.go:117] "RemoveContainer" containerID="9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b" Dec 03 17:21:02 crc kubenswrapper[4841]: E1203 17:21:02.828958 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b\": container with ID starting with 9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b not found: ID does not exist" containerID="9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.829002 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b"} err="failed to get container status \"9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b\": rpc error: code = NotFound desc = could not find container \"9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b\": container with ID 
starting with 9d0efa9d38ea238a5ede338a3607f6210495b1b64308733e20f2bfe0be5bca9b not found: ID does not exist" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.829020 4841 scope.go:117] "RemoveContainer" containerID="34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774" Dec 03 17:21:02 crc kubenswrapper[4841]: E1203 17:21:02.830333 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774\": container with ID starting with 34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774 not found: ID does not exist" containerID="34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.830364 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774"} err="failed to get container status \"34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774\": rpc error: code = NotFound desc = could not find container \"34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774\": container with ID starting with 34890afbfe54bf91d002455322600fa00c7694fe5478eb7b874b2c2f5d88d774 not found: ID does not exist" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.888166 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.888257 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-run-httpd\") pod 
\"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.888316 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.888340 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-config-data\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.888377 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-log-httpd\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.888407 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-scripts\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.888427 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zzm8\" (UniqueName: \"kubernetes.io/projected/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-kube-api-access-7zzm8\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: 
I1203 17:21:02.989483 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-log-httpd\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.989530 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-scripts\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.989557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zzm8\" (UniqueName: \"kubernetes.io/projected/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-kube-api-access-7zzm8\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.989595 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.989662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-run-httpd\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.989692 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.989716 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-config-data\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.990111 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-log-httpd\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.990673 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-run-httpd\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.994153 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.994597 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.995076 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-config-data\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:02 crc kubenswrapper[4841]: I1203 17:21:02.995842 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-scripts\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.009108 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zzm8\" (UniqueName: \"kubernetes.io/projected/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-kube-api-access-7zzm8\") pod \"ceilometer-0\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " pod="openstack/ceilometer-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.094060 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.136081 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.517141 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-frcwp"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.518966 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.520875 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.522055 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.526938 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-frcwp"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.605337 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-config-data\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.605807 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.606146 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-scripts\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.606230 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk9k7\" (UniqueName: 
\"kubernetes.io/projected/21f13e1c-c61f-4ccc-881a-036eede4e140-kube-api-access-jk9k7\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.662110 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.698625 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.704318 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.706733 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.707557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.707619 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-scripts\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.707651 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk9k7\" (UniqueName: \"kubernetes.io/projected/21f13e1c-c61f-4ccc-881a-036eede4e140-kube-api-access-jk9k7\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " 
pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.707676 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-config-data\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.710182 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.713051 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-scripts\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.716442 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-config-data\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.722646 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.730788 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk9k7\" (UniqueName: \"kubernetes.io/projected/21f13e1c-c61f-4ccc-881a-036eede4e140-kube-api-access-jk9k7\") pod 
\"nova-cell0-cell-mapping-frcwp\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.806999 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.808883 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.809163 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7kxt\" (UniqueName: \"kubernetes.io/projected/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-kube-api-access-b7kxt\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.809221 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-logs\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.809791 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.809864 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-config-data\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.822313 4841 
reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.834754 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.842923 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.869568 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.871081 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.877816 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.893783 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.917590 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd51dd53-a0e7-4655-b083-8b84b0c92a32-logs\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.921434 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-config-data\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.921582 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.921662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-config-data\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.921766 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7kxt\" (UniqueName: \"kubernetes.io/projected/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-kube-api-access-b7kxt\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.921853 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-logs\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.921974 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4v94\" (UniqueName: \"kubernetes.io/projected/fd51dd53-a0e7-4655-b083-8b84b0c92a32-kube-api-access-m4v94\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.922077 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.926939 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-config-data\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.927276 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-logs\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.931003 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.954357 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.955539 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.962358 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7kxt\" (UniqueName: \"kubernetes.io/projected/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-kube-api-access-b7kxt\") pod \"nova-api-0\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " pod="openstack/nova-api-0" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.966025 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 17:21:03 crc kubenswrapper[4841]: I1203 17:21:03.970422 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.020990 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pwfml"] Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.022877 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.023639 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-config-data\") pod \"nova-scheduler-0\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.023691 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd51dd53-a0e7-4655-b083-8b84b0c92a32-logs\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.023781 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-config-data\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.023844 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdznw\" (UniqueName: \"kubernetes.io/projected/7603d9be-4f93-41d8-9c6d-2238fb26ea01-kube-api-access-kdznw\") pod \"nova-cell1-novncproxy-0\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.023872 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 
17:21:04.023904 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.023945 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4v94\" (UniqueName: \"kubernetes.io/projected/fd51dd53-a0e7-4655-b083-8b84b0c92a32-kube-api-access-m4v94\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.023974 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.024014 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rp94\" (UniqueName: \"kubernetes.io/projected/2deb29d0-20a5-4c0f-b97f-520223071914-kube-api-access-6rp94\") pod \"nova-scheduler-0\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.024040 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.024483 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd51dd53-a0e7-4655-b083-8b84b0c92a32-logs\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.030000 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pwfml"] Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.030456 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.045467 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.053573 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-config-data\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.061457 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4v94\" (UniqueName: \"kubernetes.io/projected/fd51dd53-a0e7-4655-b083-8b84b0c92a32-kube-api-access-m4v94\") pod \"nova-metadata-0\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " pod="openstack/nova-metadata-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.126942 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: 
\"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127001 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-svc\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127027 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-config\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127089 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vpf5\" (UniqueName: \"kubernetes.io/projected/7f65a151-5183-42bf-ba67-e0943727b455-kube-api-access-6vpf5\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127187 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdznw\" (UniqueName: \"kubernetes.io/projected/7603d9be-4f93-41d8-9c6d-2238fb26ea01-kube-api-access-kdznw\") pod \"nova-cell1-novncproxy-0\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127217 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127244 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127288 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rp94\" (UniqueName: \"kubernetes.io/projected/2deb29d0-20a5-4c0f-b97f-520223071914-kube-api-access-6rp94\") pod \"nova-scheduler-0\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127329 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127351 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127400 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-config-data\") pod \"nova-scheduler-0\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " pod="openstack/nova-scheduler-0" Dec 03 
17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.127461 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.128894 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.133823 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.135705 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-config-data\") pod \"nova-scheduler-0\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.147805 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.148233 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rp94\" (UniqueName: \"kubernetes.io/projected/2deb29d0-20a5-4c0f-b97f-520223071914-kube-api-access-6rp94\") pod \"nova-scheduler-0\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " 
pod="openstack/nova-scheduler-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.150824 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdznw\" (UniqueName: \"kubernetes.io/projected/7603d9be-4f93-41d8-9c6d-2238fb26ea01-kube-api-access-kdznw\") pod \"nova-cell1-novncproxy-0\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.151411 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.217357 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.228516 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.228601 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.228621 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-nb\") 
pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.228643 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-svc\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.228662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-config\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.228697 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vpf5\" (UniqueName: \"kubernetes.io/projected/7f65a151-5183-42bf-ba67-e0943727b455-kube-api-access-6vpf5\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.229884 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-nb\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.229923 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-swift-storage-0\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " 
pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.230632 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-sb\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.234555 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-config\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.234574 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-svc\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.257228 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vpf5\" (UniqueName: \"kubernetes.io/projected/7f65a151-5183-42bf-ba67-e0943727b455-kube-api-access-6vpf5\") pod \"dnsmasq-dns-9b86998b5-pwfml\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.260010 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.283319 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01203ee3-03f7-4d19-b510-f15cfc780aa5" path="/var/lib/kubelet/pods/01203ee3-03f7-4d19-b510-f15cfc780aa5/volumes" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.303480 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.472182 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-frcwp"] Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.656412 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-frcwp" event={"ID":"21f13e1c-c61f-4ccc-881a-036eede4e140","Type":"ContainerStarted","Data":"699dee783303919f0fd0fb08a309ebf457a91eeafda30078132e1033592a7285"} Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.659736 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerStarted","Data":"2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690"} Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.659780 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerStarted","Data":"dee088d2a109435581fbe1c2d2f99c403f9314b61990f5049cabda41a813d5b6"} Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.722225 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:04 crc kubenswrapper[4841]: W1203 17:21:04.731428 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd51dd53_a0e7_4655_b083_8b84b0c92a32.slice/crio-3e572606c80fb94c049d27a912a5f786b2fe61011cfd6999a334fa5136201e7c WatchSource:0}: Error finding container 3e572606c80fb94c049d27a912a5f786b2fe61011cfd6999a334fa5136201e7c: Status 404 returned error can't find the container with id 3e572606c80fb94c049d27a912a5f786b2fe61011cfd6999a334fa5136201e7c Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.738275 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.835702 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.915588 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pwfml"] Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.937446 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d5k7k"] Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.938791 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.942777 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.942925 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 17:21:04 crc kubenswrapper[4841]: I1203 17:21:04.959960 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d5k7k"] Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.021381 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:05 crc kubenswrapper[4841]: W1203 17:21:05.032246 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2deb29d0_20a5_4c0f_b97f_520223071914.slice/crio-a0e8c83eb001ff83d19ee2e070132c28816b187525ef9c6ec4820e6121c8b13b WatchSource:0}: Error finding container a0e8c83eb001ff83d19ee2e070132c28816b187525ef9c6ec4820e6121c8b13b: Status 404 returned error can't find the container with id a0e8c83eb001ff83d19ee2e070132c28816b187525ef9c6ec4820e6121c8b13b Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.048057 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-config-data\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.048205 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-combined-ca-bundle\") pod 
\"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.048266 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-scripts\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.048298 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmd6q\" (UniqueName: \"kubernetes.io/projected/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-kube-api-access-rmd6q\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.150131 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.150522 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-scripts\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.150563 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmd6q\" (UniqueName: 
\"kubernetes.io/projected/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-kube-api-access-rmd6q\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.150618 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-config-data\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.155026 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-config-data\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.155057 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.163522 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-scripts\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.171592 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmd6q\" (UniqueName: 
\"kubernetes.io/projected/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-kube-api-access-rmd6q\") pod \"nova-cell1-conductor-db-sync-d5k7k\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.273352 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.669758 4841 generic.go:334] "Generic (PLEG): container finished" podID="7f65a151-5183-42bf-ba67-e0943727b455" containerID="5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910" exitCode=0 Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.669867 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" event={"ID":"7f65a151-5183-42bf-ba67-e0943727b455","Type":"ContainerDied","Data":"5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910"} Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.670161 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" event={"ID":"7f65a151-5183-42bf-ba67-e0943727b455","Type":"ContainerStarted","Data":"0f2a1844b1d868d5e1649dffffcec0903f20ca3cc9f2baffad74c3749a44e0c6"} Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.671407 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd51dd53-a0e7-4655-b083-8b84b0c92a32","Type":"ContainerStarted","Data":"3e572606c80fb94c049d27a912a5f786b2fe61011cfd6999a334fa5136201e7c"} Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.681152 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38","Type":"ContainerStarted","Data":"3584b37bd57754f654dde84b5eb5ade8c80a69d44fa27efa466cef5011b5b63c"} Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.683420 4841 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-frcwp" event={"ID":"21f13e1c-c61f-4ccc-881a-036eede4e140","Type":"ContainerStarted","Data":"d543f4ded22e1009abf6e163398e56c9a56bcbd5645826441422339c784a568d"} Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.684981 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2deb29d0-20a5-4c0f-b97f-520223071914","Type":"ContainerStarted","Data":"a0e8c83eb001ff83d19ee2e070132c28816b187525ef9c6ec4820e6121c8b13b"} Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.694494 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7603d9be-4f93-41d8-9c6d-2238fb26ea01","Type":"ContainerStarted","Data":"bb9c21a82e7c00d92b07b08de41489c49ef3a004c8dcce615df34934d1a86360"} Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.732778 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-frcwp" podStartSLOduration=2.732757475 podStartE2EDuration="2.732757475s" podCreationTimestamp="2025-12-03 17:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:05.716175359 +0000 UTC m=+1260.103696086" watchObservedRunningTime="2025-12-03 17:21:05.732757475 +0000 UTC m=+1260.120278202" Dec 03 17:21:05 crc kubenswrapper[4841]: I1203 17:21:05.756297 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d5k7k"] Dec 03 17:21:06 crc kubenswrapper[4841]: I1203 17:21:06.718192 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerStarted","Data":"de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4"} Dec 03 17:21:06 crc kubenswrapper[4841]: I1203 17:21:06.721434 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell1-conductor-db-sync-d5k7k" event={"ID":"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15","Type":"ContainerStarted","Data":"89e0789f1b8c8934114b13671a97a4776581e7656c70eba2d0e7d28c9ea4648c"} Dec 03 17:21:06 crc kubenswrapper[4841]: I1203 17:21:06.721471 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d5k7k" event={"ID":"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15","Type":"ContainerStarted","Data":"b7665de54c59435b5e0b24bdab7013133618d124c1be4266d6eeee492b3d3f46"} Dec 03 17:21:06 crc kubenswrapper[4841]: I1203 17:21:06.733584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" event={"ID":"7f65a151-5183-42bf-ba67-e0943727b455","Type":"ContainerStarted","Data":"1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942"} Dec 03 17:21:06 crc kubenswrapper[4841]: I1203 17:21:06.733645 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:06 crc kubenswrapper[4841]: I1203 17:21:06.761480 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-d5k7k" podStartSLOduration=2.761456061 podStartE2EDuration="2.761456061s" podCreationTimestamp="2025-12-03 17:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:06.739702275 +0000 UTC m=+1261.127223002" watchObservedRunningTime="2025-12-03 17:21:06.761456061 +0000 UTC m=+1261.148976788" Dec 03 17:21:06 crc kubenswrapper[4841]: I1203 17:21:06.772236 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" podStartSLOduration=3.772216611 podStartE2EDuration="3.772216611s" podCreationTimestamp="2025-12-03 17:21:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 17:21:06.768081927 +0000 UTC m=+1261.155602664" watchObservedRunningTime="2025-12-03 17:21:06.772216611 +0000 UTC m=+1261.159737358" Dec 03 17:21:07 crc kubenswrapper[4841]: I1203 17:21:07.574751 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:07 crc kubenswrapper[4841]: I1203 17:21:07.671637 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 17:21:07 crc kubenswrapper[4841]: I1203 17:21:07.756053 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerStarted","Data":"90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae"} Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.782069 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2deb29d0-20a5-4c0f-b97f-520223071914","Type":"ContainerStarted","Data":"946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc"} Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.784153 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7603d9be-4f93-41d8-9c6d-2238fb26ea01","Type":"ContainerStarted","Data":"23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98"} Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.784209 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="7603d9be-4f93-41d8-9c6d-2238fb26ea01" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98" gracePeriod=30 Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.787274 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerStarted","Data":"072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8"} Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.787973 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.791082 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd51dd53-a0e7-4655-b083-8b84b0c92a32","Type":"ContainerStarted","Data":"18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1"} Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.791119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd51dd53-a0e7-4655-b083-8b84b0c92a32","Type":"ContainerStarted","Data":"a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a"} Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.791175 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerName="nova-metadata-metadata" containerID="cri-o://18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1" gracePeriod=30 Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.791170 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerName="nova-metadata-log" containerID="cri-o://a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a" gracePeriod=30 Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.801131 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38","Type":"ContainerStarted","Data":"f58cc12d5bbafef3479de43eb48690f580d426a18b8f0b6f9722b001d5b8dd17"} Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.801175 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38","Type":"ContainerStarted","Data":"c93c6aa7faa3baccc0b45bca9bf8bf0c8f11aee08bc7f1a4a863bc814ececbe5"} Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.805085 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.777849876 podStartE2EDuration="6.805070878s" podCreationTimestamp="2025-12-03 17:21:03 +0000 UTC" firstStartedPulling="2025-12-03 17:21:05.036008145 +0000 UTC m=+1259.423528872" lastFinishedPulling="2025-12-03 17:21:09.063229147 +0000 UTC m=+1263.450749874" observedRunningTime="2025-12-03 17:21:09.802144075 +0000 UTC m=+1264.189664802" watchObservedRunningTime="2025-12-03 17:21:09.805070878 +0000 UTC m=+1264.192591605" Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.831193 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.432456594 podStartE2EDuration="7.831176013s" podCreationTimestamp="2025-12-03 17:21:02 +0000 UTC" firstStartedPulling="2025-12-03 17:21:03.664900128 +0000 UTC m=+1258.052420855" lastFinishedPulling="2025-12-03 17:21:09.063619547 +0000 UTC m=+1263.451140274" observedRunningTime="2025-12-03 17:21:09.823399168 +0000 UTC m=+1264.210919895" watchObservedRunningTime="2025-12-03 17:21:09.831176013 +0000 UTC m=+1264.218696740" Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.854767 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.534789129 podStartE2EDuration="6.854747905s" podCreationTimestamp="2025-12-03 17:21:03 +0000 UTC" firstStartedPulling="2025-12-03 17:21:04.742961863 +0000 UTC m=+1259.130482590" lastFinishedPulling="2025-12-03 17:21:09.062920639 +0000 UTC m=+1263.450441366" observedRunningTime="2025-12-03 17:21:09.844936219 +0000 UTC m=+1264.232456946" 
watchObservedRunningTime="2025-12-03 17:21:09.854747905 +0000 UTC m=+1264.242268632" Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.867257 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.539914817 podStartE2EDuration="6.867241808s" podCreationTimestamp="2025-12-03 17:21:03 +0000 UTC" firstStartedPulling="2025-12-03 17:21:04.735694401 +0000 UTC m=+1259.123215128" lastFinishedPulling="2025-12-03 17:21:09.063021392 +0000 UTC m=+1263.450542119" observedRunningTime="2025-12-03 17:21:09.864011237 +0000 UTC m=+1264.251531964" watchObservedRunningTime="2025-12-03 17:21:09.867241808 +0000 UTC m=+1264.254762535" Dec 03 17:21:09 crc kubenswrapper[4841]: I1203 17:21:09.898156 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.675244002 podStartE2EDuration="6.898141333s" podCreationTimestamp="2025-12-03 17:21:03 +0000 UTC" firstStartedPulling="2025-12-03 17:21:04.83369581 +0000 UTC m=+1259.221216537" lastFinishedPulling="2025-12-03 17:21:09.056593141 +0000 UTC m=+1263.444113868" observedRunningTime="2025-12-03 17:21:09.894934233 +0000 UTC m=+1264.282454970" watchObservedRunningTime="2025-12-03 17:21:09.898141333 +0000 UTC m=+1264.285662060" Dec 03 17:21:10 crc kubenswrapper[4841]: I1203 17:21:10.817631 4841 generic.go:334] "Generic (PLEG): container finished" podID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerID="a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a" exitCode=143 Dec 03 17:21:10 crc kubenswrapper[4841]: I1203 17:21:10.819154 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd51dd53-a0e7-4655-b083-8b84b0c92a32","Type":"ContainerDied","Data":"a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a"} Dec 03 17:21:12 crc kubenswrapper[4841]: I1203 17:21:12.845825 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="21f13e1c-c61f-4ccc-881a-036eede4e140" containerID="d543f4ded22e1009abf6e163398e56c9a56bcbd5645826441422339c784a568d" exitCode=0 Dec 03 17:21:12 crc kubenswrapper[4841]: I1203 17:21:12.845914 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-frcwp" event={"ID":"21f13e1c-c61f-4ccc-881a-036eede4e140","Type":"ContainerDied","Data":"d543f4ded22e1009abf6e163398e56c9a56bcbd5645826441422339c784a568d"} Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.031372 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.031621 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.129452 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.129498 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.218549 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.261945 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.304750 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.304782 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.327717 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.341777 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-dhr82"] Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.341999 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" podUID="575f396c-3448-4351-867c-54ac4b0b211f" containerName="dnsmasq-dns" containerID="cri-o://62bdce21b8c15e4288d57a63b707dae053dd42cf9c1507221681f5355b57b67d" gracePeriod=10 Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.359119 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.441772 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-combined-ca-bundle\") pod \"21f13e1c-c61f-4ccc-881a-036eede4e140\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.441879 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-config-data\") pod \"21f13e1c-c61f-4ccc-881a-036eede4e140\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.441953 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk9k7\" (UniqueName: \"kubernetes.io/projected/21f13e1c-c61f-4ccc-881a-036eede4e140-kube-api-access-jk9k7\") pod \"21f13e1c-c61f-4ccc-881a-036eede4e140\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.442137 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-scripts\") pod \"21f13e1c-c61f-4ccc-881a-036eede4e140\" (UID: \"21f13e1c-c61f-4ccc-881a-036eede4e140\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.448715 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21f13e1c-c61f-4ccc-881a-036eede4e140-kube-api-access-jk9k7" (OuterVolumeSpecName: "kube-api-access-jk9k7") pod "21f13e1c-c61f-4ccc-881a-036eede4e140" (UID: "21f13e1c-c61f-4ccc-881a-036eede4e140"). InnerVolumeSpecName "kube-api-access-jk9k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.450133 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-scripts" (OuterVolumeSpecName: "scripts") pod "21f13e1c-c61f-4ccc-881a-036eede4e140" (UID: "21f13e1c-c61f-4ccc-881a-036eede4e140"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.529460 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21f13e1c-c61f-4ccc-881a-036eede4e140" (UID: "21f13e1c-c61f-4ccc-881a-036eede4e140"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.544824 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk9k7\" (UniqueName: \"kubernetes.io/projected/21f13e1c-c61f-4ccc-881a-036eede4e140-kube-api-access-jk9k7\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.544855 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.544864 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.546117 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-config-data" (OuterVolumeSpecName: "config-data") pod "21f13e1c-c61f-4ccc-881a-036eede4e140" (UID: "21f13e1c-c61f-4ccc-881a-036eede4e140"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.647045 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f13e1c-c61f-4ccc-881a-036eede4e140-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.867850 4841 generic.go:334] "Generic (PLEG): container finished" podID="575f396c-3448-4351-867c-54ac4b0b211f" containerID="62bdce21b8c15e4288d57a63b707dae053dd42cf9c1507221681f5355b57b67d" exitCode=0 Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.867925 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" event={"ID":"575f396c-3448-4351-867c-54ac4b0b211f","Type":"ContainerDied","Data":"62bdce21b8c15e4288d57a63b707dae053dd42cf9c1507221681f5355b57b67d"} Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.869250 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-frcwp" event={"ID":"21f13e1c-c61f-4ccc-881a-036eede4e140","Type":"ContainerDied","Data":"699dee783303919f0fd0fb08a309ebf457a91eeafda30078132e1033592a7285"} Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.869274 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="699dee783303919f0fd0fb08a309ebf457a91eeafda30078132e1033592a7285" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.869330 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-frcwp" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.873978 4841 generic.go:334] "Generic (PLEG): container finished" podID="06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15" containerID="89e0789f1b8c8934114b13671a97a4776581e7656c70eba2d0e7d28c9ea4648c" exitCode=0 Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.874047 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d5k7k" event={"ID":"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15","Type":"ContainerDied","Data":"89e0789f1b8c8934114b13671a97a4776581e7656c70eba2d0e7d28c9ea4648c"} Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.896942 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.933706 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.953885 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-sb\") pod \"575f396c-3448-4351-867c-54ac4b0b211f\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.954004 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-swift-storage-0\") pod \"575f396c-3448-4351-867c-54ac4b0b211f\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.954062 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4jnb\" (UniqueName: \"kubernetes.io/projected/575f396c-3448-4351-867c-54ac4b0b211f-kube-api-access-b4jnb\") pod 
\"575f396c-3448-4351-867c-54ac4b0b211f\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.954132 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-svc\") pod \"575f396c-3448-4351-867c-54ac4b0b211f\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.954155 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-config\") pod \"575f396c-3448-4351-867c-54ac4b0b211f\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.954223 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-nb\") pod \"575f396c-3448-4351-867c-54ac4b0b211f\" (UID: \"575f396c-3448-4351-867c-54ac4b0b211f\") " Dec 03 17:21:14 crc kubenswrapper[4841]: I1203 17:21:14.999192 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/575f396c-3448-4351-867c-54ac4b0b211f-kube-api-access-b4jnb" (OuterVolumeSpecName: "kube-api-access-b4jnb") pod "575f396c-3448-4351-867c-54ac4b0b211f" (UID: "575f396c-3448-4351-867c-54ac4b0b211f"). InnerVolumeSpecName "kube-api-access-b4jnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.043670 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.044249 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-log" containerID="cri-o://c93c6aa7faa3baccc0b45bca9bf8bf0c8f11aee08bc7f1a4a863bc814ececbe5" gracePeriod=30 Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.044776 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-api" containerID="cri-o://f58cc12d5bbafef3479de43eb48690f580d426a18b8f0b6f9722b001d5b8dd17" gracePeriod=30 Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.057209 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4jnb\" (UniqueName: \"kubernetes.io/projected/575f396c-3448-4351-867c-54ac4b0b211f-kube-api-access-b4jnb\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.065253 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.065628 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.191:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.066519 4841 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "575f396c-3448-4351-867c-54ac4b0b211f" (UID: "575f396c-3448-4351-867c-54ac4b0b211f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.069381 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "575f396c-3448-4351-867c-54ac4b0b211f" (UID: "575f396c-3448-4351-867c-54ac4b0b211f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.074255 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-config" (OuterVolumeSpecName: "config") pod "575f396c-3448-4351-867c-54ac4b0b211f" (UID: "575f396c-3448-4351-867c-54ac4b0b211f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.081313 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "575f396c-3448-4351-867c-54ac4b0b211f" (UID: "575f396c-3448-4351-867c-54ac4b0b211f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.092596 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "575f396c-3448-4351-867c-54ac4b0b211f" (UID: "575f396c-3448-4351-867c-54ac4b0b211f"). 
InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.158649 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.158695 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.158708 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.158723 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.158740 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/575f396c-3448-4351-867c-54ac4b0b211f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.440426 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.901393 4841 generic.go:334] "Generic (PLEG): container finished" podID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerID="c93c6aa7faa3baccc0b45bca9bf8bf0c8f11aee08bc7f1a4a863bc814ececbe5" exitCode=143 Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.901527 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38","Type":"ContainerDied","Data":"c93c6aa7faa3baccc0b45bca9bf8bf0c8f11aee08bc7f1a4a863bc814ececbe5"} Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.904040 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.920275 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7756b9d78c-dhr82" event={"ID":"575f396c-3448-4351-867c-54ac4b0b211f","Type":"ContainerDied","Data":"ae970e292ba5e9e7f902921d0de8717ffa3233b24373a41f8d0223f514ed1e92"} Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.920383 4841 scope.go:117] "RemoveContainer" containerID="62bdce21b8c15e4288d57a63b707dae053dd42cf9c1507221681f5355b57b67d" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.971121 4841 scope.go:117] "RemoveContainer" containerID="8ace8f01fa94626762f0165ad54fdf1464f92dc65b9896b257c24a33e9181469" Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.975944 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-dhr82"] Dec 03 17:21:15 crc kubenswrapper[4841]: I1203 17:21:15.986962 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7756b9d78c-dhr82"] Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.249019 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="575f396c-3448-4351-867c-54ac4b0b211f" path="/var/lib/kubelet/pods/575f396c-3448-4351-867c-54ac4b0b211f/volumes" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.336587 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.386492 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-combined-ca-bundle\") pod \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.386637 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-config-data\") pod \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.386757 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmd6q\" (UniqueName: \"kubernetes.io/projected/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-kube-api-access-rmd6q\") pod \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.386800 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-scripts\") pod \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\" (UID: \"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15\") " Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.416417 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-scripts" (OuterVolumeSpecName: "scripts") pod "06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15" (UID: "06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.416538 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-kube-api-access-rmd6q" (OuterVolumeSpecName: "kube-api-access-rmd6q") pod "06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15" (UID: "06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15"). InnerVolumeSpecName "kube-api-access-rmd6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.483136 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-config-data" (OuterVolumeSpecName: "config-data") pod "06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15" (UID: "06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.487952 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15" (UID: "06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.489141 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmd6q\" (UniqueName: \"kubernetes.io/projected/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-kube-api-access-rmd6q\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.489176 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.489187 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.489196 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.924834 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-d5k7k" event={"ID":"06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15","Type":"ContainerDied","Data":"b7665de54c59435b5e0b24bdab7013133618d124c1be4266d6eeee492b3d3f46"} Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.924900 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7665de54c59435b5e0b24bdab7013133618d124c1be4266d6eeee492b3d3f46" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.924857 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-d5k7k" Dec 03 17:21:16 crc kubenswrapper[4841]: I1203 17:21:16.924981 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="2deb29d0-20a5-4c0f-b97f-520223071914" containerName="nova-scheduler-scheduler" containerID="cri-o://946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc" gracePeriod=30 Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.007666 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 17:21:17 crc kubenswrapper[4841]: E1203 17:21:17.008270 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15" containerName="nova-cell1-conductor-db-sync" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.008295 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15" containerName="nova-cell1-conductor-db-sync" Dec 03 17:21:17 crc kubenswrapper[4841]: E1203 17:21:17.008335 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575f396c-3448-4351-867c-54ac4b0b211f" containerName="dnsmasq-dns" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.008343 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="575f396c-3448-4351-867c-54ac4b0b211f" containerName="dnsmasq-dns" Dec 03 17:21:17 crc kubenswrapper[4841]: E1203 17:21:17.008354 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21f13e1c-c61f-4ccc-881a-036eede4e140" containerName="nova-manage" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.008362 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="21f13e1c-c61f-4ccc-881a-036eede4e140" containerName="nova-manage" Dec 03 17:21:17 crc kubenswrapper[4841]: E1203 17:21:17.008380 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="575f396c-3448-4351-867c-54ac4b0b211f" containerName="init" Dec 03 
17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.008388 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="575f396c-3448-4351-867c-54ac4b0b211f" containerName="init" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.008610 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="575f396c-3448-4351-867c-54ac4b0b211f" containerName="dnsmasq-dns" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.008630 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15" containerName="nova-cell1-conductor-db-sync" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.008645 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="21f13e1c-c61f-4ccc-881a-036eede4e140" containerName="nova-manage" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.009434 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.011942 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.012447 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.112849 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qm9t\" (UniqueName: \"kubernetes.io/projected/de4fa66a-9d61-40a5-97f9-4c7841e1ca58-kube-api-access-7qm9t\") pod \"nova-cell1-conductor-0\" (UID: \"de4fa66a-9d61-40a5-97f9-4c7841e1ca58\") " pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.113180 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4fa66a-9d61-40a5-97f9-4c7841e1ca58-config-data\") pod 
\"nova-cell1-conductor-0\" (UID: \"de4fa66a-9d61-40a5-97f9-4c7841e1ca58\") " pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.113256 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4fa66a-9d61-40a5-97f9-4c7841e1ca58-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"de4fa66a-9d61-40a5-97f9-4c7841e1ca58\") " pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.214833 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qm9t\" (UniqueName: \"kubernetes.io/projected/de4fa66a-9d61-40a5-97f9-4c7841e1ca58-kube-api-access-7qm9t\") pod \"nova-cell1-conductor-0\" (UID: \"de4fa66a-9d61-40a5-97f9-4c7841e1ca58\") " pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.215008 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4fa66a-9d61-40a5-97f9-4c7841e1ca58-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"de4fa66a-9d61-40a5-97f9-4c7841e1ca58\") " pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.215091 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4fa66a-9d61-40a5-97f9-4c7841e1ca58-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"de4fa66a-9d61-40a5-97f9-4c7841e1ca58\") " pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.221755 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de4fa66a-9d61-40a5-97f9-4c7841e1ca58-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"de4fa66a-9d61-40a5-97f9-4c7841e1ca58\") " 
pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.222672 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de4fa66a-9d61-40a5-97f9-4c7841e1ca58-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"de4fa66a-9d61-40a5-97f9-4c7841e1ca58\") " pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.234122 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qm9t\" (UniqueName: \"kubernetes.io/projected/de4fa66a-9d61-40a5-97f9-4c7841e1ca58-kube-api-access-7qm9t\") pod \"nova-cell1-conductor-0\" (UID: \"de4fa66a-9d61-40a5-97f9-4c7841e1ca58\") " pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.331520 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:17 crc kubenswrapper[4841]: W1203 17:21:17.786218 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde4fa66a_9d61_40a5_97f9_4c7841e1ca58.slice/crio-8f7415deb7b38f8156a46e942cf9e48d686cc614f6fc6bd99272e160a64350a6 WatchSource:0}: Error finding container 8f7415deb7b38f8156a46e942cf9e48d686cc614f6fc6bd99272e160a64350a6: Status 404 returned error can't find the container with id 8f7415deb7b38f8156a46e942cf9e48d686cc614f6fc6bd99272e160a64350a6 Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.788590 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 17:21:17 crc kubenswrapper[4841]: I1203 17:21:17.935826 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"de4fa66a-9d61-40a5-97f9-4c7841e1ca58","Type":"ContainerStarted","Data":"8f7415deb7b38f8156a46e942cf9e48d686cc614f6fc6bd99272e160a64350a6"} Dec 03 17:21:18 crc 
kubenswrapper[4841]: I1203 17:21:18.952249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"de4fa66a-9d61-40a5-97f9-4c7841e1ca58","Type":"ContainerStarted","Data":"aeff7f3ecb3b4b9cac372c23f59373d3d3e0da64ca6c09a0bff7b1376bf64dc5"} Dec 03 17:21:18 crc kubenswrapper[4841]: I1203 17:21:18.953482 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:18 crc kubenswrapper[4841]: I1203 17:21:18.988531 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.988216015 podStartE2EDuration="2.988216015s" podCreationTimestamp="2025-12-03 17:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:18.980240445 +0000 UTC m=+1273.367761172" watchObservedRunningTime="2025-12-03 17:21:18.988216015 +0000 UTC m=+1273.375736782" Dec 03 17:21:19 crc kubenswrapper[4841]: E1203 17:21:19.306500 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 17:21:19 crc kubenswrapper[4841]: E1203 17:21:19.307707 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 17:21:19 crc kubenswrapper[4841]: E1203 17:21:19.308664 4841 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register 
an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 17:21:19 crc kubenswrapper[4841]: E1203 17:21:19.308708 4841 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="2deb29d0-20a5-4c0f-b97f-520223071914" containerName="nova-scheduler-scheduler" Dec 03 17:21:20 crc kubenswrapper[4841]: I1203 17:21:20.981244 4841 generic.go:334] "Generic (PLEG): container finished" podID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerID="f58cc12d5bbafef3479de43eb48690f580d426a18b8f0b6f9722b001d5b8dd17" exitCode=0 Dec 03 17:21:20 crc kubenswrapper[4841]: I1203 17:21:20.981350 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38","Type":"ContainerDied","Data":"f58cc12d5bbafef3479de43eb48690f580d426a18b8f0b6f9722b001d5b8dd17"} Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.538656 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.614729 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-combined-ca-bundle\") pod \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.614778 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7kxt\" (UniqueName: \"kubernetes.io/projected/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-kube-api-access-b7kxt\") pod \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.614850 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-logs\") pod \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.614965 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-config-data\") pod \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\" (UID: \"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38\") " Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.621111 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-kube-api-access-b7kxt" (OuterVolumeSpecName: "kube-api-access-b7kxt") pod "65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" (UID: "65a8b295-9fbd-4ecc-a944-0ee41dd6dc38"). InnerVolumeSpecName "kube-api-access-b7kxt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.633706 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-logs" (OuterVolumeSpecName: "logs") pod "65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" (UID: "65a8b295-9fbd-4ecc-a944-0ee41dd6dc38"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.648149 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-config-data" (OuterVolumeSpecName: "config-data") pod "65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" (UID: "65a8b295-9fbd-4ecc-a944-0ee41dd6dc38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.648317 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" (UID: "65a8b295-9fbd-4ecc-a944-0ee41dd6dc38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.718219 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.718564 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.718577 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7kxt\" (UniqueName: \"kubernetes.io/projected/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-kube-api-access-b7kxt\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.718589 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.744898 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.820024 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-config-data\") pod \"2deb29d0-20a5-4c0f-b97f-520223071914\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.820081 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-combined-ca-bundle\") pod \"2deb29d0-20a5-4c0f-b97f-520223071914\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.820201 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6rp94\" (UniqueName: \"kubernetes.io/projected/2deb29d0-20a5-4c0f-b97f-520223071914-kube-api-access-6rp94\") pod \"2deb29d0-20a5-4c0f-b97f-520223071914\" (UID: \"2deb29d0-20a5-4c0f-b97f-520223071914\") " Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.827381 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2deb29d0-20a5-4c0f-b97f-520223071914-kube-api-access-6rp94" (OuterVolumeSpecName: "kube-api-access-6rp94") pod "2deb29d0-20a5-4c0f-b97f-520223071914" (UID: "2deb29d0-20a5-4c0f-b97f-520223071914"). InnerVolumeSpecName "kube-api-access-6rp94". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.852589 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2deb29d0-20a5-4c0f-b97f-520223071914" (UID: "2deb29d0-20a5-4c0f-b97f-520223071914"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.852887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-config-data" (OuterVolumeSpecName: "config-data") pod "2deb29d0-20a5-4c0f-b97f-520223071914" (UID: "2deb29d0-20a5-4c0f-b97f-520223071914"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.923048 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.923091 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2deb29d0-20a5-4c0f-b97f-520223071914-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:21 crc kubenswrapper[4841]: I1203 17:21:21.923105 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6rp94\" (UniqueName: \"kubernetes.io/projected/2deb29d0-20a5-4c0f-b97f-520223071914-kube-api-access-6rp94\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.004897 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"65a8b295-9fbd-4ecc-a944-0ee41dd6dc38","Type":"ContainerDied","Data":"3584b37bd57754f654dde84b5eb5ade8c80a69d44fa27efa466cef5011b5b63c"} Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.004987 4841 scope.go:117] "RemoveContainer" containerID="f58cc12d5bbafef3479de43eb48690f580d426a18b8f0b6f9722b001d5b8dd17" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.005003 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.006902 4841 generic.go:334] "Generic (PLEG): container finished" podID="2deb29d0-20a5-4c0f-b97f-520223071914" containerID="946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc" exitCode=0 Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.006961 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2deb29d0-20a5-4c0f-b97f-520223071914","Type":"ContainerDied","Data":"946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc"} Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.006987 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.006995 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2deb29d0-20a5-4c0f-b97f-520223071914","Type":"ContainerDied","Data":"a0e8c83eb001ff83d19ee2e070132c28816b187525ef9c6ec4820e6121c8b13b"} Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.037949 4841 scope.go:117] "RemoveContainer" containerID="c93c6aa7faa3baccc0b45bca9bf8bf0c8f11aee08bc7f1a4a863bc814ececbe5" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.048053 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.060967 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.062483 4841 scope.go:117] "RemoveContainer" containerID="946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.073657 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.082605 4841 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.093849 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.094115 4841 scope.go:117] "RemoveContainer" containerID="946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc" Dec 03 17:21:22 crc kubenswrapper[4841]: E1203 17:21:22.094302 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-log" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.094324 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-log" Dec 03 17:21:22 crc kubenswrapper[4841]: E1203 17:21:22.094347 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2deb29d0-20a5-4c0f-b97f-520223071914" containerName="nova-scheduler-scheduler" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.094355 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2deb29d0-20a5-4c0f-b97f-520223071914" containerName="nova-scheduler-scheduler" Dec 03 17:21:22 crc kubenswrapper[4841]: E1203 17:21:22.094371 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-api" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.094379 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-api" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.094615 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2deb29d0-20a5-4c0f-b97f-520223071914" containerName="nova-scheduler-scheduler" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.094653 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-api" Dec 03 17:21:22 crc 
kubenswrapper[4841]: I1203 17:21:22.094664 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" containerName="nova-api-log" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.095891 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: E1203 17:21:22.096257 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc\": container with ID starting with 946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc not found: ID does not exist" containerID="946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.096308 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc"} err="failed to get container status \"946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc\": rpc error: code = NotFound desc = could not find container \"946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc\": container with ID starting with 946e788269efe77732586488a98a96255d9e4e2dac86a7126e5754d5cb1953dc not found: ID does not exist" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.103095 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.104668 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.105445 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.111785 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.112757 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.120940 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.127073 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fb922a7-bf26-43b6-bd77-70af112434c7-logs\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.127281 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw8s7\" (UniqueName: \"kubernetes.io/projected/5fb922a7-bf26-43b6-bd77-70af112434c7-kube-api-access-fw8s7\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.127365 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.127468 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-config-data\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.228978 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.229037 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrsp9\" (UniqueName: \"kubernetes.io/projected/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-kube-api-access-rrsp9\") pod \"nova-scheduler-0\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.229058 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-config-data\") pod \"nova-scheduler-0\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.229184 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw8s7\" (UniqueName: \"kubernetes.io/projected/5fb922a7-bf26-43b6-bd77-70af112434c7-kube-api-access-fw8s7\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.229214 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.229267 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-config-data\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.230129 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fb922a7-bf26-43b6-bd77-70af112434c7-logs\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.230524 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fb922a7-bf26-43b6-bd77-70af112434c7-logs\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.233584 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.238151 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-config-data\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.251397 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2deb29d0-20a5-4c0f-b97f-520223071914" 
path="/var/lib/kubelet/pods/2deb29d0-20a5-4c0f-b97f-520223071914/volumes" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.252519 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a8b295-9fbd-4ecc-a944-0ee41dd6dc38" path="/var/lib/kubelet/pods/65a8b295-9fbd-4ecc-a944-0ee41dd6dc38/volumes" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.253322 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw8s7\" (UniqueName: \"kubernetes.io/projected/5fb922a7-bf26-43b6-bd77-70af112434c7-kube-api-access-fw8s7\") pod \"nova-api-0\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.331778 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.331884 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrsp9\" (UniqueName: \"kubernetes.io/projected/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-kube-api-access-rrsp9\") pod \"nova-scheduler-0\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.331956 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-config-data\") pod \"nova-scheduler-0\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.336750 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-config-data\") pod \"nova-scheduler-0\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.337633 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.351474 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrsp9\" (UniqueName: \"kubernetes.io/projected/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-kube-api-access-rrsp9\") pod \"nova-scheduler-0\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") " pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.366645 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.425838 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.434515 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.878896 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:22 crc kubenswrapper[4841]: W1203 17:21:22.884130 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fb922a7_bf26_43b6_bd77_70af112434c7.slice/crio-3929ffb1b16b346bb92f2fcffd97846d8007de1bf4b4658d1d2f67108a289eba WatchSource:0}: Error finding container 3929ffb1b16b346bb92f2fcffd97846d8007de1bf4b4658d1d2f67108a289eba: Status 404 returned error can't find the container with id 3929ffb1b16b346bb92f2fcffd97846d8007de1bf4b4658d1d2f67108a289eba Dec 03 17:21:22 crc kubenswrapper[4841]: I1203 17:21:22.990839 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:23 crc kubenswrapper[4841]: I1203 17:21:23.021115 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fb922a7-bf26-43b6-bd77-70af112434c7","Type":"ContainerStarted","Data":"3929ffb1b16b346bb92f2fcffd97846d8007de1bf4b4658d1d2f67108a289eba"} Dec 03 17:21:24 crc kubenswrapper[4841]: I1203 17:21:24.050466 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a","Type":"ContainerStarted","Data":"73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4"} Dec 03 17:21:24 crc kubenswrapper[4841]: I1203 17:21:24.050933 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a","Type":"ContainerStarted","Data":"02592d52b35000ec8d802e47d1af661678b557d7084cb84e692b79b4f5a29b09"} Dec 03 17:21:24 crc kubenswrapper[4841]: I1203 17:21:24.058357 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"5fb922a7-bf26-43b6-bd77-70af112434c7","Type":"ContainerStarted","Data":"edc6416659e53ba054e285f97f8cd4e0c3d23a1dc3157d2fcafb981d1241604d"} Dec 03 17:21:24 crc kubenswrapper[4841]: I1203 17:21:24.058441 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fb922a7-bf26-43b6-bd77-70af112434c7","Type":"ContainerStarted","Data":"766e6cef81d2558032d38deae49e281c5935af3c5b93907b90f0703155b28582"} Dec 03 17:21:24 crc kubenswrapper[4841]: I1203 17:21:24.101162 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.101135514 podStartE2EDuration="2.101135514s" podCreationTimestamp="2025-12-03 17:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:24.086758733 +0000 UTC m=+1278.474279500" watchObservedRunningTime="2025-12-03 17:21:24.101135514 +0000 UTC m=+1278.488656281" Dec 03 17:21:24 crc kubenswrapper[4841]: I1203 17:21:24.123626 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.123604597 podStartE2EDuration="2.123604597s" podCreationTimestamp="2025-12-03 17:21:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:24.123587007 +0000 UTC m=+1278.511107774" watchObservedRunningTime="2025-12-03 17:21:24.123604597 +0000 UTC m=+1278.511125334" Dec 03 17:21:27 crc kubenswrapper[4841]: I1203 17:21:27.434725 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 17:21:32 crc kubenswrapper[4841]: I1203 17:21:32.426219 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 17:21:32 crc kubenswrapper[4841]: I1203 17:21:32.426895 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 17:21:32 crc kubenswrapper[4841]: I1203 17:21:32.434713 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 17:21:32 crc kubenswrapper[4841]: I1203 17:21:32.480554 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 17:21:33 crc kubenswrapper[4841]: I1203 17:21:33.159471 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 17:21:33 crc kubenswrapper[4841]: I1203 17:21:33.208767 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 17:21:33 crc kubenswrapper[4841]: I1203 17:21:33.510954 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 17:21:33 crc kubenswrapper[4841]: I1203 17:21:33.511393 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.198:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 17:21:37 crc kubenswrapper[4841]: I1203 17:21:37.181015 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 17:21:37 crc kubenswrapper[4841]: I1203 17:21:37.181407 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="800c114f-56e0-4bb3-8b43-f6b2f623584a" containerName="kube-state-metrics" containerID="cri-o://841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1" gracePeriod=30 Dec 03 17:21:37 crc 
kubenswrapper[4841]: I1203 17:21:37.661143 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 17:21:37 crc kubenswrapper[4841]: I1203 17:21:37.780652 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcc7c\" (UniqueName: \"kubernetes.io/projected/800c114f-56e0-4bb3-8b43-f6b2f623584a-kube-api-access-qcc7c\") pod \"800c114f-56e0-4bb3-8b43-f6b2f623584a\" (UID: \"800c114f-56e0-4bb3-8b43-f6b2f623584a\") " Dec 03 17:21:37 crc kubenswrapper[4841]: I1203 17:21:37.792107 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/800c114f-56e0-4bb3-8b43-f6b2f623584a-kube-api-access-qcc7c" (OuterVolumeSpecName: "kube-api-access-qcc7c") pod "800c114f-56e0-4bb3-8b43-f6b2f623584a" (UID: "800c114f-56e0-4bb3-8b43-f6b2f623584a"). InnerVolumeSpecName "kube-api-access-qcc7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:37 crc kubenswrapper[4841]: I1203 17:21:37.883203 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcc7c\" (UniqueName: \"kubernetes.io/projected/800c114f-56e0-4bb3-8b43-f6b2f623584a-kube-api-access-qcc7c\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.216722 4841 generic.go:334] "Generic (PLEG): container finished" podID="800c114f-56e0-4bb3-8b43-f6b2f623584a" containerID="841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1" exitCode=2 Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.216846 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.216857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"800c114f-56e0-4bb3-8b43-f6b2f623584a","Type":"ContainerDied","Data":"841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1"} Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.217197 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"800c114f-56e0-4bb3-8b43-f6b2f623584a","Type":"ContainerDied","Data":"078193a9c68c2825a492f0f3731d568c122c6169ee4e9c8a6581834848783336"} Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.217223 4841 scope.go:117] "RemoveContainer" containerID="841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.251328 4841 scope.go:117] "RemoveContainer" containerID="841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1" Dec 03 17:21:38 crc kubenswrapper[4841]: E1203 17:21:38.254807 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1\": container with ID starting with 841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1 not found: ID does not exist" containerID="841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.254881 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1"} err="failed to get container status \"841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1\": rpc error: code = NotFound desc = could not find container \"841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1\": container with ID starting with 
841c25d3ccc687b1971c1be83772f2f50bd0961f4185507c59736afb2ba47cd1 not found: ID does not exist" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.262287 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.268065 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.280205 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 17:21:38 crc kubenswrapper[4841]: E1203 17:21:38.280711 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="800c114f-56e0-4bb3-8b43-f6b2f623584a" containerName="kube-state-metrics" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.280729 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="800c114f-56e0-4bb3-8b43-f6b2f623584a" containerName="kube-state-metrics" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.280907 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="800c114f-56e0-4bb3-8b43-f6b2f623584a" containerName="kube-state-metrics" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.281569 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.286331 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.286338 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.290331 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.393987 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/18b7d958-8083-489d-ab83-9cc342dbad71-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.394058 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/18b7d958-8083-489d-ab83-9cc342dbad71-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.394096 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5g96\" (UniqueName: \"kubernetes.io/projected/18b7d958-8083-489d-ab83-9cc342dbad71-kube-api-access-n5g96\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.394125 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/18b7d958-8083-489d-ab83-9cc342dbad71-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.495862 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/18b7d958-8083-489d-ab83-9cc342dbad71-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.495935 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/18b7d958-8083-489d-ab83-9cc342dbad71-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.495973 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5g96\" (UniqueName: \"kubernetes.io/projected/18b7d958-8083-489d-ab83-9cc342dbad71-kube-api-access-n5g96\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.495997 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b7d958-8083-489d-ab83-9cc342dbad71-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.501225 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: 
\"kubernetes.io/secret/18b7d958-8083-489d-ab83-9cc342dbad71-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.501390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18b7d958-8083-489d-ab83-9cc342dbad71-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.502250 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/18b7d958-8083-489d-ab83-9cc342dbad71-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.516885 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5g96\" (UniqueName: \"kubernetes.io/projected/18b7d958-8083-489d-ab83-9cc342dbad71-kube-api-access-n5g96\") pod \"kube-state-metrics-0\" (UID: \"18b7d958-8083-489d-ab83-9cc342dbad71\") " pod="openstack/kube-state-metrics-0" Dec 03 17:21:38 crc kubenswrapper[4841]: I1203 17:21:38.605350 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.098837 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.099458 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="ceilometer-central-agent" containerID="cri-o://2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690" gracePeriod=30 Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.099515 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="sg-core" containerID="cri-o://90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae" gracePeriod=30 Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.099610 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="ceilometer-notification-agent" containerID="cri-o://de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4" gracePeriod=30 Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.099977 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="proxy-httpd" containerID="cri-o://072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8" gracePeriod=30 Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.160321 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.166180 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.228996 4841 
generic.go:334] "Generic (PLEG): container finished" podID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerID="072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8" exitCode=0 Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.229284 4841 generic.go:334] "Generic (PLEG): container finished" podID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerID="90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae" exitCode=2 Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.229075 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerDied","Data":"072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8"} Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.229318 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerDied","Data":"90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae"} Dec 03 17:21:39 crc kubenswrapper[4841]: I1203 17:21:39.230351 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"18b7d958-8083-489d-ab83-9cc342dbad71","Type":"ContainerStarted","Data":"4d8a802ea102fda98b2a11edc9478857573ad6b77a9f4c2586df86e36fa4a9ae"} Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.126689 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.181344 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.229176 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdznw\" (UniqueName: \"kubernetes.io/projected/7603d9be-4f93-41d8-9c6d-2238fb26ea01-kube-api-access-kdznw\") pod \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.229311 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-config-data\") pod \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.229356 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd51dd53-a0e7-4655-b083-8b84b0c92a32-logs\") pod \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.229455 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-config-data\") pod \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.229489 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-combined-ca-bundle\") pod \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.229510 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-combined-ca-bundle\") pod \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\" (UID: \"7603d9be-4f93-41d8-9c6d-2238fb26ea01\") " Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.229573 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4v94\" (UniqueName: \"kubernetes.io/projected/fd51dd53-a0e7-4655-b083-8b84b0c92a32-kube-api-access-m4v94\") pod \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\" (UID: \"fd51dd53-a0e7-4655-b083-8b84b0c92a32\") " Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.229778 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd51dd53-a0e7-4655-b083-8b84b0c92a32-logs" (OuterVolumeSpecName: "logs") pod "fd51dd53-a0e7-4655-b083-8b84b0c92a32" (UID: "fd51dd53-a0e7-4655-b083-8b84b0c92a32"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.230434 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd51dd53-a0e7-4655-b083-8b84b0c92a32-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.234155 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd51dd53-a0e7-4655-b083-8b84b0c92a32-kube-api-access-m4v94" (OuterVolumeSpecName: "kube-api-access-m4v94") pod "fd51dd53-a0e7-4655-b083-8b84b0c92a32" (UID: "fd51dd53-a0e7-4655-b083-8b84b0c92a32"). InnerVolumeSpecName "kube-api-access-m4v94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.239568 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7603d9be-4f93-41d8-9c6d-2238fb26ea01-kube-api-access-kdznw" (OuterVolumeSpecName: "kube-api-access-kdznw") pod "7603d9be-4f93-41d8-9c6d-2238fb26ea01" (UID: "7603d9be-4f93-41d8-9c6d-2238fb26ea01"). InnerVolumeSpecName "kube-api-access-kdznw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.240880 4841 generic.go:334] "Generic (PLEG): container finished" podID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerID="18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1" exitCode=137 Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.240978 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.244505 4841 generic.go:334] "Generic (PLEG): container finished" podID="7603d9be-4f93-41d8-9c6d-2238fb26ea01" containerID="23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98" exitCode=137 Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.244621 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.247979 4841 generic.go:334] "Generic (PLEG): container finished" podID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerID="2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690" exitCode=0 Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.258107 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-config-data" (OuterVolumeSpecName: "config-data") pod "7603d9be-4f93-41d8-9c6d-2238fb26ea01" (UID: "7603d9be-4f93-41d8-9c6d-2238fb26ea01"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.258760 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="800c114f-56e0-4bb3-8b43-f6b2f623584a" path="/var/lib/kubelet/pods/800c114f-56e0-4bb3-8b43-f6b2f623584a/volumes" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.266789 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd51dd53-a0e7-4655-b083-8b84b0c92a32" (UID: "fd51dd53-a0e7-4655-b083-8b84b0c92a32"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.268601 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.905519415 podStartE2EDuration="2.268578709s" podCreationTimestamp="2025-12-03 17:21:38 +0000 UTC" firstStartedPulling="2025-12-03 17:21:39.165969324 +0000 UTC m=+1293.553490051" lastFinishedPulling="2025-12-03 17:21:39.529028608 +0000 UTC m=+1293.916549345" observedRunningTime="2025-12-03 17:21:40.258723025 +0000 UTC m=+1294.646243752" watchObservedRunningTime="2025-12-03 17:21:40.268578709 +0000 UTC m=+1294.656099436" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.270196 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.270240 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fd51dd53-a0e7-4655-b083-8b84b0c92a32","Type":"ContainerDied","Data":"18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1"} Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.270262 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-metadata-0" event={"ID":"fd51dd53-a0e7-4655-b083-8b84b0c92a32","Type":"ContainerDied","Data":"3e572606c80fb94c049d27a912a5f786b2fe61011cfd6999a334fa5136201e7c"} Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.270273 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"18b7d958-8083-489d-ab83-9cc342dbad71","Type":"ContainerStarted","Data":"26ec1227b65e59b0eaa3057a88bb3b20b813d30754b2979a99127cadee5f392a"} Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.270283 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7603d9be-4f93-41d8-9c6d-2238fb26ea01","Type":"ContainerDied","Data":"23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98"} Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.270295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7603d9be-4f93-41d8-9c6d-2238fb26ea01","Type":"ContainerDied","Data":"bb9c21a82e7c00d92b07b08de41489c49ef3a004c8dcce615df34934d1a86360"} Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.270306 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerDied","Data":"2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690"} Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.270324 4841 scope.go:117] "RemoveContainer" containerID="18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.270815 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7603d9be-4f93-41d8-9c6d-2238fb26ea01" (UID: "7603d9be-4f93-41d8-9c6d-2238fb26ea01"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.275550 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-config-data" (OuterVolumeSpecName: "config-data") pod "fd51dd53-a0e7-4655-b083-8b84b0c92a32" (UID: "fd51dd53-a0e7-4655-b083-8b84b0c92a32"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.302395 4841 scope.go:117] "RemoveContainer" containerID="a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.317863 4841 scope.go:117] "RemoveContainer" containerID="18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1" Dec 03 17:21:40 crc kubenswrapper[4841]: E1203 17:21:40.318231 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1\": container with ID starting with 18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1 not found: ID does not exist" containerID="18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.318262 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1"} err="failed to get container status \"18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1\": rpc error: code = NotFound desc = could not find container \"18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1\": container with ID starting with 18e6045358aaea0bee2bfae5c879cd933e4f0379b0a42347a972a686aecd72c1 not found: ID does not exist" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.318286 4841 scope.go:117] "RemoveContainer" 
containerID="a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a" Dec 03 17:21:40 crc kubenswrapper[4841]: E1203 17:21:40.318765 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a\": container with ID starting with a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a not found: ID does not exist" containerID="a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.318819 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a"} err="failed to get container status \"a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a\": rpc error: code = NotFound desc = could not find container \"a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a\": container with ID starting with a8a37b750e600bdabbfa195d59fb237f385d57ce0f49782ca0732fd8a9b3f05a not found: ID does not exist" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.318878 4841 scope.go:117] "RemoveContainer" containerID="23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.333201 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.333311 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd51dd53-a0e7-4655-b083-8b84b0c92a32-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.333328 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.333339 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4v94\" (UniqueName: \"kubernetes.io/projected/fd51dd53-a0e7-4655-b083-8b84b0c92a32-kube-api-access-m4v94\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.333351 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdznw\" (UniqueName: \"kubernetes.io/projected/7603d9be-4f93-41d8-9c6d-2238fb26ea01-kube-api-access-kdznw\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.333363 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7603d9be-4f93-41d8-9c6d-2238fb26ea01-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.337330 4841 scope.go:117] "RemoveContainer" containerID="23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98" Dec 03 17:21:40 crc kubenswrapper[4841]: E1203 17:21:40.337891 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98\": container with ID starting with 23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98 not found: ID does not exist" containerID="23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.337946 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98"} err="failed to get container status \"23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98\": rpc error: code = NotFound desc = could not find container 
\"23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98\": container with ID starting with 23aca8c97cc73f5747340163acd0e724f2fe99a07bd90d75b59fcb7758820e98 not found: ID does not exist" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.625385 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.636352 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.653743 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.662761 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 17:21:40 crc kubenswrapper[4841]: E1203 17:21:40.663490 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7603d9be-4f93-41d8-9c6d-2238fb26ea01" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.663522 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7603d9be-4f93-41d8-9c6d-2238fb26ea01" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 17:21:40 crc kubenswrapper[4841]: E1203 17:21:40.663577 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerName="nova-metadata-metadata" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.663592 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerName="nova-metadata-metadata" Dec 03 17:21:40 crc kubenswrapper[4841]: E1203 17:21:40.663635 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerName="nova-metadata-log" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.663650 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerName="nova-metadata-log" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.664081 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7603d9be-4f93-41d8-9c6d-2238fb26ea01" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.664140 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerName="nova-metadata-log" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.664174 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" containerName="nova-metadata-metadata" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.665255 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.668597 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.669363 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.669552 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.678924 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.700980 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.710613 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.712589 4841 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.716887 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.716998 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.734677 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.741763 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-logs\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.741809 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.741843 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.741868 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz67p\" (UniqueName: 
\"kubernetes.io/projected/187b2155-68e5-419d-b438-e22374486ae8-kube-api-access-cz67p\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.742036 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.742171 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cstv4\" (UniqueName: \"kubernetes.io/projected/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-kube-api-access-cstv4\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.742203 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.742230 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.742282 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-config-data\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.742323 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.844083 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cstv4\" (UniqueName: \"kubernetes.io/projected/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-kube-api-access-cstv4\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.844158 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.844190 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.844238 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-config-data\") pod 
\"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.844435 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.844539 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-logs\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.844599 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.844653 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.844675 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cz67p\" (UniqueName: \"kubernetes.io/projected/187b2155-68e5-419d-b438-e22374486ae8-kube-api-access-cz67p\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc 
kubenswrapper[4841]: I1203 17:21:40.844751 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.845179 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-logs\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.849610 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.849808 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.849878 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-config-data\") pod \"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.850060 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.850483 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.850640 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.862428 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/187b2155-68e5-419d-b438-e22374486ae8-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.863225 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz67p\" (UniqueName: \"kubernetes.io/projected/187b2155-68e5-419d-b438-e22374486ae8-kube-api-access-cz67p\") pod \"nova-cell1-novncproxy-0\" (UID: \"187b2155-68e5-419d-b438-e22374486ae8\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.880267 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cstv4\" (UniqueName: \"kubernetes.io/projected/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-kube-api-access-cstv4\") pod 
\"nova-metadata-0\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") " pod="openstack/nova-metadata-0" Dec 03 17:21:40 crc kubenswrapper[4841]: I1203 17:21:40.985362 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:41 crc kubenswrapper[4841]: I1203 17:21:41.027577 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 17:21:41 crc kubenswrapper[4841]: I1203 17:21:41.548703 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 17:21:41 crc kubenswrapper[4841]: W1203 17:21:41.599645 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf0a3ff45_7f6c_4780_8821_2f46d98d23f9.slice/crio-aef552f29293469c302e96e85d42e4a696ef39727c39d48556025d175c539bc4 WatchSource:0}: Error finding container aef552f29293469c302e96e85d42e4a696ef39727c39d48556025d175c539bc4: Status 404 returned error can't find the container with id aef552f29293469c302e96e85d42e4a696ef39727c39d48556025d175c539bc4 Dec 03 17:21:41 crc kubenswrapper[4841]: I1203 17:21:41.603305 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.247812 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7603d9be-4f93-41d8-9c6d-2238fb26ea01" path="/var/lib/kubelet/pods/7603d9be-4f93-41d8-9c6d-2238fb26ea01/volumes" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.249042 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd51dd53-a0e7-4655-b083-8b84b0c92a32" path="/var/lib/kubelet/pods/fd51dd53-a0e7-4655-b083-8b84b0c92a32/volumes" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.276654 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"f0a3ff45-7f6c-4780-8821-2f46d98d23f9","Type":"ContainerStarted","Data":"11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db"} Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.276694 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0a3ff45-7f6c-4780-8821-2f46d98d23f9","Type":"ContainerStarted","Data":"5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91"} Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.276706 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0a3ff45-7f6c-4780-8821-2f46d98d23f9","Type":"ContainerStarted","Data":"aef552f29293469c302e96e85d42e4a696ef39727c39d48556025d175c539bc4"} Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.278194 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"187b2155-68e5-419d-b438-e22374486ae8","Type":"ContainerStarted","Data":"9a10efc1af88bd3cd912d81e2175590567596fed2097892c8cc47d9d685e4a3e"} Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.278219 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"187b2155-68e5-419d-b438-e22374486ae8","Type":"ContainerStarted","Data":"0ab86ba52f1d4f4701c79375d94b26070b4110861bbf5394a9175a3a2b0cdde6"} Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.298595 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.298579029 podStartE2EDuration="2.298579029s" podCreationTimestamp="2025-12-03 17:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:42.29821475 +0000 UTC m=+1296.685735477" watchObservedRunningTime="2025-12-03 17:21:42.298579029 +0000 UTC m=+1296.686099756" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.320868 4841 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.320849522 podStartE2EDuration="2.320849522s" podCreationTimestamp="2025-12-03 17:21:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:42.315735635 +0000 UTC m=+1296.703256362" watchObservedRunningTime="2025-12-03 17:21:42.320849522 +0000 UTC m=+1296.708370259" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.430532 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.431385 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.431757 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.433696 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.781785 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.888826 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-sg-core-conf-yaml\") pod \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.889260 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-run-httpd\") pod \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.889382 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-scripts\") pod \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.889581 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-log-httpd\") pod \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.889828 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" (UID: "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.890062 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" (UID: "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.890382 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zzm8\" (UniqueName: \"kubernetes.io/projected/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-kube-api-access-7zzm8\") pod \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.890504 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-config-data\") pod \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.890631 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-combined-ca-bundle\") pod \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\" (UID: \"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9\") " Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.891666 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.892188 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.894314 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-scripts" (OuterVolumeSpecName: "scripts") pod "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" (UID: "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.910275 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-kube-api-access-7zzm8" (OuterVolumeSpecName: "kube-api-access-7zzm8") pod "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" (UID: "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9"). InnerVolumeSpecName "kube-api-access-7zzm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.928642 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" (UID: "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.994671 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.994900 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:42 crc kubenswrapper[4841]: I1203 17:21:42.996011 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zzm8\" (UniqueName: \"kubernetes.io/projected/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-kube-api-access-7zzm8\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.003203 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" (UID: "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.031111 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-config-data" (OuterVolumeSpecName: "config-data") pod "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" (UID: "8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.100570 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.100598 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.290262 4841 generic.go:334] "Generic (PLEG): container finished" podID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerID="de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4" exitCode=0 Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.290720 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerDied","Data":"de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4"} Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.290852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9","Type":"ContainerDied","Data":"dee088d2a109435581fbe1c2d2f99c403f9314b61990f5049cabda41a813d5b6"} Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.291002 4841 scope.go:117] "RemoveContainer" containerID="072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.291069 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.291934 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.295941 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.340521 4841 scope.go:117] "RemoveContainer" containerID="90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.355560 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.380937 4841 scope.go:117] "RemoveContainer" containerID="de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.386021 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.401625 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:43 crc kubenswrapper[4841]: E1203 17:21:43.402101 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="ceilometer-central-agent" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.402113 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="ceilometer-central-agent" Dec 03 17:21:43 crc kubenswrapper[4841]: E1203 17:21:43.402141 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="proxy-httpd" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.402147 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="proxy-httpd" Dec 03 17:21:43 crc 
kubenswrapper[4841]: E1203 17:21:43.402172 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="sg-core" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.402178 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="sg-core" Dec 03 17:21:43 crc kubenswrapper[4841]: E1203 17:21:43.402190 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="ceilometer-notification-agent" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.402195 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="ceilometer-notification-agent" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.402382 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="ceilometer-central-agent" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.402392 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="proxy-httpd" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.402399 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="sg-core" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.402418 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" containerName="ceilometer-notification-agent" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.404055 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.409530 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.410253 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.414166 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.418089 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.425514 4841 scope.go:117] "RemoveContainer" containerID="2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.460166 4841 scope.go:117] "RemoveContainer" containerID="072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8" Dec 03 17:21:43 crc kubenswrapper[4841]: E1203 17:21:43.465050 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8\": container with ID starting with 072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8 not found: ID does not exist" containerID="072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.465096 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8"} err="failed to get container status \"072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8\": rpc error: code = NotFound desc = could not find container \"072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8\": 
container with ID starting with 072edc955ea53dde04e6f26493bddf152970608ed5ac8ce0ddfc5bb64b4782b8 not found: ID does not exist" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.465121 4841 scope.go:117] "RemoveContainer" containerID="90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae" Dec 03 17:21:43 crc kubenswrapper[4841]: E1203 17:21:43.472948 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae\": container with ID starting with 90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae not found: ID does not exist" containerID="90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.472991 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae"} err="failed to get container status \"90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae\": rpc error: code = NotFound desc = could not find container \"90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae\": container with ID starting with 90e27f77924a8d98b5b12b9f23355d13b634efa77ab435bd00df3df679cf23ae not found: ID does not exist" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.473021 4841 scope.go:117] "RemoveContainer" containerID="de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4" Dec 03 17:21:43 crc kubenswrapper[4841]: E1203 17:21:43.483205 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4\": container with ID starting with de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4 not found: ID does not exist" 
containerID="de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.483254 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4"} err="failed to get container status \"de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4\": rpc error: code = NotFound desc = could not find container \"de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4\": container with ID starting with de88463a9130657738d0eca74000cb44aeb07ca0d84574f59e0725ce84a1a5f4 not found: ID does not exist" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.483278 4841 scope.go:117] "RemoveContainer" containerID="2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690" Dec 03 17:21:43 crc kubenswrapper[4841]: E1203 17:21:43.488046 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690\": container with ID starting with 2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690 not found: ID does not exist" containerID="2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.488096 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690"} err="failed to get container status \"2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690\": rpc error: code = NotFound desc = could not find container \"2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690\": container with ID starting with 2583a43b151dd8493056b2cf3c943b4c2091096719aa7e7203a7018215656690 not found: ID does not exist" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.492932 4841 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m"] Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.494460 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.503452 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m"] Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.507049 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-scripts\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.507123 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-config-data\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.507148 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.507167 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwpwz\" (UniqueName: \"kubernetes.io/projected/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-kube-api-access-hwpwz\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.507192 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.507279 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-log-httpd\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.507305 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.507336 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-run-httpd\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.608835 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmt82\" (UniqueName: \"kubernetes.io/projected/9ab11961-383b-4fe3-bdc2-fe78e71617c0-kube-api-access-mmt82\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.608886 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.608940 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-run-httpd\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.608965 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.608988 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609020 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-config\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609040 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-scripts\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609065 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609100 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-config-data\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609118 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609133 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwpwz\" (UniqueName: \"kubernetes.io/projected/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-kube-api-access-hwpwz\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609161 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " 
pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609203 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609226 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-log-httpd\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.609626 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-log-httpd\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.610340 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-run-httpd\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.614345 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.615993 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.617800 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-config-data\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.625717 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.638332 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwpwz\" (UniqueName: \"kubernetes.io/projected/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-kube-api-access-hwpwz\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.640689 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-scripts\") pod \"ceilometer-0\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.710746 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: 
I1203 17:21:43.710989 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmt82\" (UniqueName: \"kubernetes.io/projected/9ab11961-383b-4fe3-bdc2-fe78e71617c0-kube-api-access-mmt82\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.711116 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.711164 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.711230 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-config\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.711294 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.711723 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.712252 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-svc\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.713721 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.713754 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.713832 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-config\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.726803 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.735544 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmt82\" (UniqueName: \"kubernetes.io/projected/9ab11961-383b-4fe3-bdc2-fe78e71617c0-kube-api-access-mmt82\") pod \"dnsmasq-dns-6b7bbf7cf9-gmd9m\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") " pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:43 crc kubenswrapper[4841]: I1203 17:21:43.864676 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:44 crc kubenswrapper[4841]: I1203 17:21:44.250050 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9" path="/var/lib/kubelet/pods/8a3f78f5-6a8b-4295-ac5f-6e8d02ec38a9/volumes" Dec 03 17:21:44 crc kubenswrapper[4841]: I1203 17:21:44.273752 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:44 crc kubenswrapper[4841]: W1203 17:21:44.276276 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda89d8591_d9e2_4398_b8b5_93e2095a0ab2.slice/crio-9b31286562afbe6956a05af6da5bbf84687f64b9ea8bcc8809652e7a596b6027 WatchSource:0}: Error finding container 9b31286562afbe6956a05af6da5bbf84687f64b9ea8bcc8809652e7a596b6027: Status 404 returned error can't find the container with id 9b31286562afbe6956a05af6da5bbf84687f64b9ea8bcc8809652e7a596b6027 Dec 03 17:21:44 crc kubenswrapper[4841]: I1203 17:21:44.301589 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerStarted","Data":"9b31286562afbe6956a05af6da5bbf84687f64b9ea8bcc8809652e7a596b6027"} Dec 03 17:21:44 crc kubenswrapper[4841]: W1203 17:21:44.388049 4841 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ab11961_383b_4fe3_bdc2_fe78e71617c0.slice/crio-9a505b2359b05d83d0894725a5f78889ac24403722037bd1dca85cae163fefc0 WatchSource:0}: Error finding container 9a505b2359b05d83d0894725a5f78889ac24403722037bd1dca85cae163fefc0: Status 404 returned error can't find the container with id 9a505b2359b05d83d0894725a5f78889ac24403722037bd1dca85cae163fefc0 Dec 03 17:21:44 crc kubenswrapper[4841]: I1203 17:21:44.388366 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m"] Dec 03 17:21:44 crc kubenswrapper[4841]: I1203 17:21:44.951950 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:45 crc kubenswrapper[4841]: I1203 17:21:45.309399 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ab11961-383b-4fe3-bdc2-fe78e71617c0" containerID="b18efab15d61ef2a77fc178364f05549dbdcecbb400909bacb200d1a7317f68b" exitCode=0 Dec 03 17:21:45 crc kubenswrapper[4841]: I1203 17:21:45.309466 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" event={"ID":"9ab11961-383b-4fe3-bdc2-fe78e71617c0","Type":"ContainerDied","Data":"b18efab15d61ef2a77fc178364f05549dbdcecbb400909bacb200d1a7317f68b"} Dec 03 17:21:45 crc kubenswrapper[4841]: I1203 17:21:45.309756 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" event={"ID":"9ab11961-383b-4fe3-bdc2-fe78e71617c0","Type":"ContainerStarted","Data":"9a505b2359b05d83d0894725a5f78889ac24403722037bd1dca85cae163fefc0"} Dec 03 17:21:45 crc kubenswrapper[4841]: I1203 17:21:45.312079 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerStarted","Data":"aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64"} Dec 03 17:21:45 crc kubenswrapper[4841]: I1203 17:21:45.985612 4841 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:46 crc kubenswrapper[4841]: I1203 17:21:46.028580 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 17:21:46 crc kubenswrapper[4841]: I1203 17:21:46.028657 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 17:21:46 crc kubenswrapper[4841]: I1203 17:21:46.332217 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" event={"ID":"9ab11961-383b-4fe3-bdc2-fe78e71617c0","Type":"ContainerStarted","Data":"2003532e07052a42e74569cc276039097a7e0d21abc8b1e946a39b48361f0990"} Dec 03 17:21:46 crc kubenswrapper[4841]: I1203 17:21:46.332932 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:46 crc kubenswrapper[4841]: I1203 17:21:46.345614 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerStarted","Data":"2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b"} Dec 03 17:21:46 crc kubenswrapper[4841]: I1203 17:21:46.375022 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" podStartSLOduration=3.375006065 podStartE2EDuration="3.375006065s" podCreationTimestamp="2025-12-03 17:21:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:46.367772546 +0000 UTC m=+1300.755293273" watchObservedRunningTime="2025-12-03 17:21:46.375006065 +0000 UTC m=+1300.762526792" Dec 03 17:21:46 crc kubenswrapper[4841]: I1203 17:21:46.880505 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:46 crc kubenswrapper[4841]: I1203 17:21:46.881121 4841 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-log" containerID="cri-o://766e6cef81d2558032d38deae49e281c5935af3c5b93907b90f0703155b28582" gracePeriod=30 Dec 03 17:21:46 crc kubenswrapper[4841]: I1203 17:21:46.881181 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-api" containerID="cri-o://edc6416659e53ba054e285f97f8cd4e0c3d23a1dc3157d2fcafb981d1241604d" gracePeriod=30 Dec 03 17:21:47 crc kubenswrapper[4841]: I1203 17:21:47.355836 4841 generic.go:334] "Generic (PLEG): container finished" podID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerID="766e6cef81d2558032d38deae49e281c5935af3c5b93907b90f0703155b28582" exitCode=143 Dec 03 17:21:47 crc kubenswrapper[4841]: I1203 17:21:47.355932 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fb922a7-bf26-43b6-bd77-70af112434c7","Type":"ContainerDied","Data":"766e6cef81d2558032d38deae49e281c5935af3c5b93907b90f0703155b28582"} Dec 03 17:21:47 crc kubenswrapper[4841]: I1203 17:21:47.358445 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerStarted","Data":"dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727"} Dec 03 17:21:48 crc kubenswrapper[4841]: I1203 17:21:48.369703 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerStarted","Data":"fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a"} Dec 03 17:21:48 crc kubenswrapper[4841]: I1203 17:21:48.370346 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" 
containerName="ceilometer-central-agent" containerID="cri-o://aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64" gracePeriod=30 Dec 03 17:21:48 crc kubenswrapper[4841]: I1203 17:21:48.370594 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 17:21:48 crc kubenswrapper[4841]: I1203 17:21:48.370882 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="proxy-httpd" containerID="cri-o://fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a" gracePeriod=30 Dec 03 17:21:48 crc kubenswrapper[4841]: I1203 17:21:48.370943 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="sg-core" containerID="cri-o://dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727" gracePeriod=30 Dec 03 17:21:48 crc kubenswrapper[4841]: I1203 17:21:48.370976 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="ceilometer-notification-agent" containerID="cri-o://2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b" gracePeriod=30 Dec 03 17:21:48 crc kubenswrapper[4841]: I1203 17:21:48.406106 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.131869492 podStartE2EDuration="5.406092292s" podCreationTimestamp="2025-12-03 17:21:43 +0000 UTC" firstStartedPulling="2025-12-03 17:21:44.278772202 +0000 UTC m=+1298.666292929" lastFinishedPulling="2025-12-03 17:21:47.552995002 +0000 UTC m=+1301.940515729" observedRunningTime="2025-12-03 17:21:48.398013822 +0000 UTC m=+1302.785534549" watchObservedRunningTime="2025-12-03 17:21:48.406092292 +0000 UTC m=+1302.793613019" Dec 03 17:21:48 crc kubenswrapper[4841]: 
I1203 17:21:48.615154 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 17:21:49 crc kubenswrapper[4841]: I1203 17:21:49.380429 4841 generic.go:334] "Generic (PLEG): container finished" podID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerID="fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a" exitCode=0 Dec 03 17:21:49 crc kubenswrapper[4841]: I1203 17:21:49.380739 4841 generic.go:334] "Generic (PLEG): container finished" podID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerID="dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727" exitCode=2 Dec 03 17:21:49 crc kubenswrapper[4841]: I1203 17:21:49.380752 4841 generic.go:334] "Generic (PLEG): container finished" podID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerID="2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b" exitCode=0 Dec 03 17:21:49 crc kubenswrapper[4841]: I1203 17:21:49.380645 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerDied","Data":"fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a"} Dec 03 17:21:49 crc kubenswrapper[4841]: I1203 17:21:49.380794 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerDied","Data":"dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727"} Dec 03 17:21:49 crc kubenswrapper[4841]: I1203 17:21:49.380811 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerDied","Data":"2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b"} Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.417443 4841 generic.go:334] "Generic (PLEG): container finished" podID="5fb922a7-bf26-43b6-bd77-70af112434c7" 
containerID="edc6416659e53ba054e285f97f8cd4e0c3d23a1dc3157d2fcafb981d1241604d" exitCode=0 Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.417702 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fb922a7-bf26-43b6-bd77-70af112434c7","Type":"ContainerDied","Data":"edc6416659e53ba054e285f97f8cd4e0c3d23a1dc3157d2fcafb981d1241604d"} Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.417730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5fb922a7-bf26-43b6-bd77-70af112434c7","Type":"ContainerDied","Data":"3929ffb1b16b346bb92f2fcffd97846d8007de1bf4b4658d1d2f67108a289eba"} Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.417742 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3929ffb1b16b346bb92f2fcffd97846d8007de1bf4b4658d1d2f67108a289eba" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.446025 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.580301 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw8s7\" (UniqueName: \"kubernetes.io/projected/5fb922a7-bf26-43b6-bd77-70af112434c7-kube-api-access-fw8s7\") pod \"5fb922a7-bf26-43b6-bd77-70af112434c7\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.580360 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-config-data\") pod \"5fb922a7-bf26-43b6-bd77-70af112434c7\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.580403 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fb922a7-bf26-43b6-bd77-70af112434c7-logs\") pod \"5fb922a7-bf26-43b6-bd77-70af112434c7\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.580419 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-combined-ca-bundle\") pod \"5fb922a7-bf26-43b6-bd77-70af112434c7\" (UID: \"5fb922a7-bf26-43b6-bd77-70af112434c7\") " Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.581354 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5fb922a7-bf26-43b6-bd77-70af112434c7-logs" (OuterVolumeSpecName: "logs") pod "5fb922a7-bf26-43b6-bd77-70af112434c7" (UID: "5fb922a7-bf26-43b6-bd77-70af112434c7"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.594129 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fb922a7-bf26-43b6-bd77-70af112434c7-kube-api-access-fw8s7" (OuterVolumeSpecName: "kube-api-access-fw8s7") pod "5fb922a7-bf26-43b6-bd77-70af112434c7" (UID: "5fb922a7-bf26-43b6-bd77-70af112434c7"). InnerVolumeSpecName "kube-api-access-fw8s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.615102 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fb922a7-bf26-43b6-bd77-70af112434c7" (UID: "5fb922a7-bf26-43b6-bd77-70af112434c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.621237 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-config-data" (OuterVolumeSpecName: "config-data") pod "5fb922a7-bf26-43b6-bd77-70af112434c7" (UID: "5fb922a7-bf26-43b6-bd77-70af112434c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.684333 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw8s7\" (UniqueName: \"kubernetes.io/projected/5fb922a7-bf26-43b6-bd77-70af112434c7-kube-api-access-fw8s7\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.684363 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.684377 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5fb922a7-bf26-43b6-bd77-70af112434c7-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.684389 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fb922a7-bf26-43b6-bd77-70af112434c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:50 crc kubenswrapper[4841]: I1203 17:21:50.986678 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.020641 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.028520 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.065676 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.424374 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.445603 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.466397 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.482918 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.515513 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:51 crc kubenswrapper[4841]: E1203 17:21:51.516093 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-api" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.516117 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-api" Dec 03 17:21:51 crc kubenswrapper[4841]: E1203 17:21:51.516154 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-log" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.516163 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-log" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.516378 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-log" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.516405 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" containerName="nova-api-api" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.517637 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.520889 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.523806 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.523827 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.523860 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.702071 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.702113 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-public-tls-certs\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.702132 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67tfm\" (UniqueName: \"kubernetes.io/projected/e7cf2d6f-67f9-4764-8a3f-b54082dca105-kube-api-access-67tfm\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.702152 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-config-data\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.702199 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cf2d6f-67f9-4764-8a3f-b54082dca105-logs\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.702287 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.703932 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-v74jz"] Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.705183 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.708592 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.708873 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.722007 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v74jz"] Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804051 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-scripts\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804179 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804223 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804284 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-config-data\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804321 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804346 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-public-tls-certs\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804370 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67tfm\" (UniqueName: \"kubernetes.io/projected/e7cf2d6f-67f9-4764-8a3f-b54082dca105-kube-api-access-67tfm\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804392 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-config-data\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804431 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf52n\" (UniqueName: \"kubernetes.io/projected/126147ee-3dab-46a0-81c9-5e1e2793cd26-kube-api-access-qf52n\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: 
\"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804455 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cf2d6f-67f9-4764-8a3f-b54082dca105-logs\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.804977 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cf2d6f-67f9-4764-8a3f-b54082dca105-logs\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.809603 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-public-tls-certs\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.810398 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.810822 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-internal-tls-certs\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.819082 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-config-data\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.819648 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67tfm\" (UniqueName: \"kubernetes.io/projected/e7cf2d6f-67f9-4764-8a3f-b54082dca105-kube-api-access-67tfm\") pod \"nova-api-0\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.842752 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.906158 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-config-data\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.906231 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf52n\" (UniqueName: \"kubernetes.io/projected/126147ee-3dab-46a0-81c9-5e1e2793cd26-kube-api-access-qf52n\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.906282 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-scripts\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.906348 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.915607 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.927575 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-config-data\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.929312 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-scripts\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:51 crc kubenswrapper[4841]: I1203 17:21:51.938374 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf52n\" (UniqueName: \"kubernetes.io/projected/126147ee-3dab-46a0-81c9-5e1e2793cd26-kube-api-access-qf52n\") pod \"nova-cell1-cell-mapping-v74jz\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:52 crc kubenswrapper[4841]: I1203 17:21:52.019329 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:52 crc kubenswrapper[4841]: I1203 17:21:52.080154 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 17:21:52 crc kubenswrapper[4841]: I1203 17:21:52.080437 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.202:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 17:21:52 crc kubenswrapper[4841]: I1203 17:21:52.249634 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fb922a7-bf26-43b6-bd77-70af112434c7" path="/var/lib/kubelet/pods/5fb922a7-bf26-43b6-bd77-70af112434c7/volumes" Dec 03 17:21:52 crc kubenswrapper[4841]: I1203 17:21:52.477197 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:52 crc kubenswrapper[4841]: W1203 17:21:52.485564 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7cf2d6f_67f9_4764_8a3f_b54082dca105.slice/crio-afa9e70afd372cc4e4eb44df0975840d2a1c2a1d8a9a50960fa9834f8a2916af WatchSource:0}: Error finding container afa9e70afd372cc4e4eb44df0975840d2a1c2a1d8a9a50960fa9834f8a2916af: Status 404 returned error can't find the container with id afa9e70afd372cc4e4eb44df0975840d2a1c2a1d8a9a50960fa9834f8a2916af Dec 03 17:21:52 crc kubenswrapper[4841]: I1203 17:21:52.573622 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-v74jz"] Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.468784 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v74jz" event={"ID":"126147ee-3dab-46a0-81c9-5e1e2793cd26","Type":"ContainerStarted","Data":"3551c145eab7c0b36bc436335bc010966277c44494769c98ee30cc072d035b37"} Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.469239 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v74jz" event={"ID":"126147ee-3dab-46a0-81c9-5e1e2793cd26","Type":"ContainerStarted","Data":"00fbbf2eedf70787a219eab30e52a7340727656cc614a1cc8bfdab4dd5d4a58f"} Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.472186 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7cf2d6f-67f9-4764-8a3f-b54082dca105","Type":"ContainerStarted","Data":"0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894"} Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.472218 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7cf2d6f-67f9-4764-8a3f-b54082dca105","Type":"ContainerStarted","Data":"312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a"} Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.472228 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7cf2d6f-67f9-4764-8a3f-b54082dca105","Type":"ContainerStarted","Data":"afa9e70afd372cc4e4eb44df0975840d2a1c2a1d8a9a50960fa9834f8a2916af"} Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.495501 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-v74jz" podStartSLOduration=2.4954835380000002 podStartE2EDuration="2.495483538s" podCreationTimestamp="2025-12-03 17:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:53.490180937 +0000 UTC m=+1307.877701664" watchObservedRunningTime="2025-12-03 
17:21:53.495483538 +0000 UTC m=+1307.883004265" Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.509333 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.509319142 podStartE2EDuration="2.509319142s" podCreationTimestamp="2025-12-03 17:21:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:21:53.507302332 +0000 UTC m=+1307.894823069" watchObservedRunningTime="2025-12-03 17:21:53.509319142 +0000 UTC m=+1307.896839869" Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.864175 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.867131 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.968199 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pwfml"] Dec 03 17:21:53 crc kubenswrapper[4841]: I1203 17:21:53.968426 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" podUID="7f65a151-5183-42bf-ba67-e0943727b455" containerName="dnsmasq-dns" containerID="cri-o://1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942" gracePeriod=10 Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.054699 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-combined-ca-bundle\") pod \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.054996 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-hwpwz\" (UniqueName: \"kubernetes.io/projected/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-kube-api-access-hwpwz\") pod \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.055180 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-ceilometer-tls-certs\") pod \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.055230 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-scripts\") pod \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.055254 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-config-data\") pod \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.055278 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-log-httpd\") pod \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.055470 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-sg-core-conf-yaml\") pod \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " Dec 03 17:21:54 crc 
kubenswrapper[4841]: I1203 17:21:54.055610 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-run-httpd\") pod \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\" (UID: \"a89d8591-d9e2-4398-b8b5-93e2095a0ab2\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.055690 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a89d8591-d9e2-4398-b8b5-93e2095a0ab2" (UID: "a89d8591-d9e2-4398-b8b5-93e2095a0ab2"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.055952 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a89d8591-d9e2-4398-b8b5-93e2095a0ab2" (UID: "a89d8591-d9e2-4398-b8b5-93e2095a0ab2"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.056726 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.056827 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.065859 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-scripts" (OuterVolumeSpecName: "scripts") pod "a89d8591-d9e2-4398-b8b5-93e2095a0ab2" (UID: "a89d8591-d9e2-4398-b8b5-93e2095a0ab2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.066465 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-kube-api-access-hwpwz" (OuterVolumeSpecName: "kube-api-access-hwpwz") pod "a89d8591-d9e2-4398-b8b5-93e2095a0ab2" (UID: "a89d8591-d9e2-4398-b8b5-93e2095a0ab2"). InnerVolumeSpecName "kube-api-access-hwpwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.110746 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a89d8591-d9e2-4398-b8b5-93e2095a0ab2" (UID: "a89d8591-d9e2-4398-b8b5-93e2095a0ab2"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.116047 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "a89d8591-d9e2-4398-b8b5-93e2095a0ab2" (UID: "a89d8591-d9e2-4398-b8b5-93e2095a0ab2"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.159847 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwpwz\" (UniqueName: \"kubernetes.io/projected/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-kube-api-access-hwpwz\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.159873 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.159883 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.159891 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.184676 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a89d8591-d9e2-4398-b8b5-93e2095a0ab2" (UID: "a89d8591-d9e2-4398-b8b5-93e2095a0ab2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.193355 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-config-data" (OuterVolumeSpecName: "config-data") pod "a89d8591-d9e2-4398-b8b5-93e2095a0ab2" (UID: "a89d8591-d9e2-4398-b8b5-93e2095a0ab2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.260685 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.260705 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a89d8591-d9e2-4398-b8b5-93e2095a0ab2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.441343 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.484628 4841 generic.go:334] "Generic (PLEG): container finished" podID="7f65a151-5183-42bf-ba67-e0943727b455" containerID="1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942" exitCode=0 Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.484721 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" event={"ID":"7f65a151-5183-42bf-ba67-e0943727b455","Type":"ContainerDied","Data":"1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942"} Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.484884 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" event={"ID":"7f65a151-5183-42bf-ba67-e0943727b455","Type":"ContainerDied","Data":"0f2a1844b1d868d5e1649dffffcec0903f20ca3cc9f2baffad74c3749a44e0c6"} Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.484933 4841 scope.go:117] "RemoveContainer" containerID="1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.484765 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.492163 4841 generic.go:334] "Generic (PLEG): container finished" podID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerID="aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64" exitCode=0 Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.492974 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerDied","Data":"aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64"} Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.493062 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a89d8591-d9e2-4398-b8b5-93e2095a0ab2","Type":"ContainerDied","Data":"9b31286562afbe6956a05af6da5bbf84687f64b9ea8bcc8809652e7a596b6027"} Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.494032 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.511149 4841 scope.go:117] "RemoveContainer" containerID="5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.537327 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.563670 4841 scope.go:117] "RemoveContainer" containerID="1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.564094 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942\": container with ID starting with 1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942 not found: ID does not exist" containerID="1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.564119 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942"} err="failed to get container status \"1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942\": rpc error: code = NotFound desc = could not find container \"1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942\": container with ID starting with 1983197f6b0651b44f74436d051cf07b7476f79f1266b39070390e950e42c942 not found: ID does not exist" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.564141 4841 scope.go:117] "RemoveContainer" containerID="5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.564230 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 
17:21:54.565228 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910\": container with ID starting with 5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910 not found: ID does not exist" containerID="5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.565255 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910"} err="failed to get container status \"5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910\": rpc error: code = NotFound desc = could not find container \"5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910\": container with ID starting with 5dfbd341e6a07cd20756bcfb56478f21ec2a4f18fc22f786dfaa57ebe17e4910 not found: ID does not exist" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.565273 4841 scope.go:117] "RemoveContainer" containerID="fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.566490 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-sb\") pod \"7f65a151-5183-42bf-ba67-e0943727b455\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.566557 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vpf5\" (UniqueName: \"kubernetes.io/projected/7f65a151-5183-42bf-ba67-e0943727b455-kube-api-access-6vpf5\") pod \"7f65a151-5183-42bf-ba67-e0943727b455\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.566705 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-swift-storage-0\") pod \"7f65a151-5183-42bf-ba67-e0943727b455\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.566726 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-svc\") pod \"7f65a151-5183-42bf-ba67-e0943727b455\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.566751 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-config\") pod \"7f65a151-5183-42bf-ba67-e0943727b455\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.566897 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-nb\") pod \"7f65a151-5183-42bf-ba67-e0943727b455\" (UID: \"7f65a151-5183-42bf-ba67-e0943727b455\") " Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.579607 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f65a151-5183-42bf-ba67-e0943727b455-kube-api-access-6vpf5" (OuterVolumeSpecName: "kube-api-access-6vpf5") pod "7f65a151-5183-42bf-ba67-e0943727b455" (UID: "7f65a151-5183-42bf-ba67-e0943727b455"). InnerVolumeSpecName "kube-api-access-6vpf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.589799 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.591643 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f65a151-5183-42bf-ba67-e0943727b455" containerName="dnsmasq-dns" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591664 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f65a151-5183-42bf-ba67-e0943727b455" containerName="dnsmasq-dns" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.591691 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="ceilometer-central-agent" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591698 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="ceilometer-central-agent" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.591719 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="sg-core" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591729 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="sg-core" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.591742 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="proxy-httpd" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591749 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="proxy-httpd" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.591759 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="ceilometer-notification-agent" Dec 
03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591764 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="ceilometer-notification-agent" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.591773 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f65a151-5183-42bf-ba67-e0943727b455" containerName="init" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591779 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f65a151-5183-42bf-ba67-e0943727b455" containerName="init" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591956 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f65a151-5183-42bf-ba67-e0943727b455" containerName="dnsmasq-dns" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591970 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="proxy-httpd" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591981 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="ceilometer-central-agent" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.591992 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="sg-core" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.592001 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" containerName="ceilometer-notification-agent" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.593785 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.597926 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.598083 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.598189 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.601812 4841 scope.go:117] "RemoveContainer" containerID="dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.612548 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.628552 4841 scope.go:117] "RemoveContainer" containerID="2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.642976 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7f65a151-5183-42bf-ba67-e0943727b455" (UID: "7f65a151-5183-42bf-ba67-e0943727b455"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.643291 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-config" (OuterVolumeSpecName: "config") pod "7f65a151-5183-42bf-ba67-e0943727b455" (UID: "7f65a151-5183-42bf-ba67-e0943727b455"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.651524 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7f65a151-5183-42bf-ba67-e0943727b455" (UID: "7f65a151-5183-42bf-ba67-e0943727b455"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.654860 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7f65a151-5183-42bf-ba67-e0943727b455" (UID: "7f65a151-5183-42bf-ba67-e0943727b455"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.657272 4841 scope.go:117] "RemoveContainer" containerID="aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.660396 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7f65a151-5183-42bf-ba67-e0943727b455" (UID: "7f65a151-5183-42bf-ba67-e0943727b455"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.669466 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.669495 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.669504 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vpf5\" (UniqueName: \"kubernetes.io/projected/7f65a151-5183-42bf-ba67-e0943727b455-kube-api-access-6vpf5\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.669515 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.669523 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.669533 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f65a151-5183-42bf-ba67-e0943727b455-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.675301 4841 scope.go:117] "RemoveContainer" containerID="fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.675722 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a\": container with ID starting with fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a not found: ID does not exist" containerID="fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.675753 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a"} err="failed to get container status \"fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a\": rpc error: code = NotFound desc = could not find container \"fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a\": container with ID starting with fdb3a1d621a23e9728ae2e21bdf935aee89161710f0047e6e6949aff5b5a2a8a not found: ID does not exist" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.675773 4841 scope.go:117] "RemoveContainer" containerID="dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.676309 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727\": container with ID starting with dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727 not found: ID does not exist" containerID="dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.676356 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727"} err="failed to get container status \"dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727\": rpc error: code = NotFound desc = could not find container 
\"dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727\": container with ID starting with dc2c5ae0634120250bb0418b2e64bab5e8bc4846ac9fe99ef6ecbdf860156727 not found: ID does not exist" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.676387 4841 scope.go:117] "RemoveContainer" containerID="2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.676804 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b\": container with ID starting with 2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b not found: ID does not exist" containerID="2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.676830 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b"} err="failed to get container status \"2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b\": rpc error: code = NotFound desc = could not find container \"2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b\": container with ID starting with 2951f879413d844c65a99fad79d2e0a831b6e6b8d8eca018627ce085dea2f19b not found: ID does not exist" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.676844 4841 scope.go:117] "RemoveContainer" containerID="aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64" Dec 03 17:21:54 crc kubenswrapper[4841]: E1203 17:21:54.677176 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64\": container with ID starting with aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64 not found: ID does not exist" 
containerID="aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.677201 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64"} err="failed to get container status \"aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64\": rpc error: code = NotFound desc = could not find container \"aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64\": container with ID starting with aef4f826f2eba0a06c54336739fb5e7e11e32b0f3030c57452e23f97c7af8a64 not found: ID does not exist" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.771025 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.771715 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-scripts\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.772072 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.772169 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-log-httpd\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.772232 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fzrg\" (UniqueName: \"kubernetes.io/projected/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-kube-api-access-6fzrg\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.772307 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-run-httpd\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.772571 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-config-data\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.772612 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.820164 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9b86998b5-pwfml"] Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.825938 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-9b86998b5-pwfml"] Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.874614 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.874676 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-scripts\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.874738 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.874766 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-log-httpd\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.874792 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fzrg\" (UniqueName: \"kubernetes.io/projected/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-kube-api-access-6fzrg\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.874820 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-run-httpd\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.874849 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-config-data\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.874897 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.876007 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-run-httpd\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.876102 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-log-httpd\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.878860 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.880019 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.880053 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-scripts\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.881950 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.882727 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-config-data\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.896549 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fzrg\" (UniqueName: \"kubernetes.io/projected/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-kube-api-access-6fzrg\") pod \"ceilometer-0\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " pod="openstack/ceilometer-0" Dec 03 17:21:54 crc kubenswrapper[4841]: I1203 17:21:54.922109 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:21:55 crc kubenswrapper[4841]: W1203 17:21:55.419456 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8fd024bf_ffe0_4569_ab22_9c6ddecb1431.slice/crio-c3f086d46ccf36ba96e648b5469cabf2af85bc86b3dd3f9a3f83fe1dc3abaf10 WatchSource:0}: Error finding container c3f086d46ccf36ba96e648b5469cabf2af85bc86b3dd3f9a3f83fe1dc3abaf10: Status 404 returned error can't find the container with id c3f086d46ccf36ba96e648b5469cabf2af85bc86b3dd3f9a3f83fe1dc3abaf10 Dec 03 17:21:55 crc kubenswrapper[4841]: I1203 17:21:55.422994 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:21:55 crc kubenswrapper[4841]: I1203 17:21:55.503153 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerStarted","Data":"c3f086d46ccf36ba96e648b5469cabf2af85bc86b3dd3f9a3f83fe1dc3abaf10"} Dec 03 17:21:56 crc kubenswrapper[4841]: I1203 17:21:56.266680 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f65a151-5183-42bf-ba67-e0943727b455" path="/var/lib/kubelet/pods/7f65a151-5183-42bf-ba67-e0943727b455/volumes" Dec 03 17:21:56 crc kubenswrapper[4841]: I1203 17:21:56.268388 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a89d8591-d9e2-4398-b8b5-93e2095a0ab2" path="/var/lib/kubelet/pods/a89d8591-d9e2-4398-b8b5-93e2095a0ab2/volumes" Dec 03 17:21:56 crc kubenswrapper[4841]: I1203 17:21:56.515433 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerStarted","Data":"8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f"} Dec 03 17:21:57 crc kubenswrapper[4841]: I1203 17:21:57.528101 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="126147ee-3dab-46a0-81c9-5e1e2793cd26" containerID="3551c145eab7c0b36bc436335bc010966277c44494769c98ee30cc072d035b37" exitCode=0 Dec 03 17:21:57 crc kubenswrapper[4841]: I1203 17:21:57.528155 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v74jz" event={"ID":"126147ee-3dab-46a0-81c9-5e1e2793cd26","Type":"ContainerDied","Data":"3551c145eab7c0b36bc436335bc010966277c44494769c98ee30cc072d035b37"} Dec 03 17:21:57 crc kubenswrapper[4841]: I1203 17:21:57.531723 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerStarted","Data":"8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a"} Dec 03 17:21:57 crc kubenswrapper[4841]: I1203 17:21:57.531760 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerStarted","Data":"1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2"} Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.003024 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.158827 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-combined-ca-bundle\") pod \"126147ee-3dab-46a0-81c9-5e1e2793cd26\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.158933 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-scripts\") pod \"126147ee-3dab-46a0-81c9-5e1e2793cd26\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.159058 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf52n\" (UniqueName: \"kubernetes.io/projected/126147ee-3dab-46a0-81c9-5e1e2793cd26-kube-api-access-qf52n\") pod \"126147ee-3dab-46a0-81c9-5e1e2793cd26\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.159139 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-config-data\") pod \"126147ee-3dab-46a0-81c9-5e1e2793cd26\" (UID: \"126147ee-3dab-46a0-81c9-5e1e2793cd26\") " Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.163251 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126147ee-3dab-46a0-81c9-5e1e2793cd26-kube-api-access-qf52n" (OuterVolumeSpecName: "kube-api-access-qf52n") pod "126147ee-3dab-46a0-81c9-5e1e2793cd26" (UID: "126147ee-3dab-46a0-81c9-5e1e2793cd26"). InnerVolumeSpecName "kube-api-access-qf52n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.163668 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-scripts" (OuterVolumeSpecName: "scripts") pod "126147ee-3dab-46a0-81c9-5e1e2793cd26" (UID: "126147ee-3dab-46a0-81c9-5e1e2793cd26"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.184677 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "126147ee-3dab-46a0-81c9-5e1e2793cd26" (UID: "126147ee-3dab-46a0-81c9-5e1e2793cd26"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.205843 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-config-data" (OuterVolumeSpecName: "config-data") pod "126147ee-3dab-46a0-81c9-5e1e2793cd26" (UID: "126147ee-3dab-46a0-81c9-5e1e2793cd26"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.260867 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-9b86998b5-pwfml" podUID="7f65a151-5183-42bf-ba67-e0943727b455" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.195:5353: i/o timeout" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.263843 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.263902 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf52n\" (UniqueName: \"kubernetes.io/projected/126147ee-3dab-46a0-81c9-5e1e2793cd26-kube-api-access-qf52n\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.263944 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.263964 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126147ee-3dab-46a0-81c9-5e1e2793cd26-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.590964 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-v74jz" event={"ID":"126147ee-3dab-46a0-81c9-5e1e2793cd26","Type":"ContainerDied","Data":"00fbbf2eedf70787a219eab30e52a7340727656cc614a1cc8bfdab4dd5d4a58f"} Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.591057 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00fbbf2eedf70787a219eab30e52a7340727656cc614a1cc8bfdab4dd5d4a58f" Dec 03 17:21:59 crc kubenswrapper[4841]: 
I1203 17:21:59.591023 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-v74jz" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.593869 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerStarted","Data":"5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817"} Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.595167 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.628143 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.5503385830000003 podStartE2EDuration="5.628124836s" podCreationTimestamp="2025-12-03 17:21:54 +0000 UTC" firstStartedPulling="2025-12-03 17:21:55.421850245 +0000 UTC m=+1309.809370972" lastFinishedPulling="2025-12-03 17:21:58.499636458 +0000 UTC m=+1312.887157225" observedRunningTime="2025-12-03 17:21:59.611924084 +0000 UTC m=+1313.999444811" watchObservedRunningTime="2025-12-03 17:21:59.628124836 +0000 UTC m=+1314.015645563" Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.849028 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.849484 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerName="nova-api-log" containerID="cri-o://312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a" gracePeriod=30 Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.849627 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerName="nova-api-api" 
containerID="cri-o://0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894" gracePeriod=30 Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.860019 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.860463 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="95ce34c2-7044-4e6f-a58c-0c2504e7bb2a" containerName="nova-scheduler-scheduler" containerID="cri-o://73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4" gracePeriod=30 Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.893958 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.894739 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-log" containerID="cri-o://5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91" gracePeriod=30 Dec 03 17:21:59 crc kubenswrapper[4841]: I1203 17:21:59.894845 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-metadata" containerID="cri-o://11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db" gracePeriod=30 Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.420964 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.590040 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-config-data\") pod \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.590129 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-internal-tls-certs\") pod \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.590161 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67tfm\" (UniqueName: \"kubernetes.io/projected/e7cf2d6f-67f9-4764-8a3f-b54082dca105-kube-api-access-67tfm\") pod \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.590194 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-combined-ca-bundle\") pod \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.590279 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cf2d6f-67f9-4764-8a3f-b54082dca105-logs\") pod \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.590495 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-public-tls-certs\") pod \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\" (UID: \"e7cf2d6f-67f9-4764-8a3f-b54082dca105\") " Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.590543 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7cf2d6f-67f9-4764-8a3f-b54082dca105-logs" (OuterVolumeSpecName: "logs") pod "e7cf2d6f-67f9-4764-8a3f-b54082dca105" (UID: "e7cf2d6f-67f9-4764-8a3f-b54082dca105"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.594449 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e7cf2d6f-67f9-4764-8a3f-b54082dca105-logs\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.595479 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7cf2d6f-67f9-4764-8a3f-b54082dca105-kube-api-access-67tfm" (OuterVolumeSpecName: "kube-api-access-67tfm") pod "e7cf2d6f-67f9-4764-8a3f-b54082dca105" (UID: "e7cf2d6f-67f9-4764-8a3f-b54082dca105"). InnerVolumeSpecName "kube-api-access-67tfm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.606483 4841 generic.go:334] "Generic (PLEG): container finished" podID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerID="5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91" exitCode=143 Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.606584 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0a3ff45-7f6c-4780-8821-2f46d98d23f9","Type":"ContainerDied","Data":"5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91"} Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.609082 4841 generic.go:334] "Generic (PLEG): container finished" podID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerID="0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894" exitCode=0 Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.609199 4841 generic.go:334] "Generic (PLEG): container finished" podID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerID="312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a" exitCode=143 Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.609399 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7cf2d6f-67f9-4764-8a3f-b54082dca105","Type":"ContainerDied","Data":"0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894"} Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.609464 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e7cf2d6f-67f9-4764-8a3f-b54082dca105","Type":"ContainerDied","Data":"312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a"} Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.609480 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"e7cf2d6f-67f9-4764-8a3f-b54082dca105","Type":"ContainerDied","Data":"afa9e70afd372cc4e4eb44df0975840d2a1c2a1d8a9a50960fa9834f8a2916af"}
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.609500 4841 scope.go:117] "RemoveContainer" containerID="0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.609805 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.628445 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-config-data" (OuterVolumeSpecName: "config-data") pod "e7cf2d6f-67f9-4764-8a3f-b54082dca105" (UID: "e7cf2d6f-67f9-4764-8a3f-b54082dca105"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.629228 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e7cf2d6f-67f9-4764-8a3f-b54082dca105" (UID: "e7cf2d6f-67f9-4764-8a3f-b54082dca105"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.673048 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e7cf2d6f-67f9-4764-8a3f-b54082dca105" (UID: "e7cf2d6f-67f9-4764-8a3f-b54082dca105"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.682289 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e7cf2d6f-67f9-4764-8a3f-b54082dca105" (UID: "e7cf2d6f-67f9-4764-8a3f-b54082dca105"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.696718 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-public-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.696756 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.696768 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.696780 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67tfm\" (UniqueName: \"kubernetes.io/projected/e7cf2d6f-67f9-4764-8a3f-b54082dca105-kube-api-access-67tfm\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.696795 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7cf2d6f-67f9-4764-8a3f-b54082dca105-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.756664 4841 scope.go:117] "RemoveContainer" containerID="312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.779294 4841 scope.go:117] "RemoveContainer" containerID="0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894"
Dec 03 17:22:00 crc kubenswrapper[4841]: E1203 17:22:00.780046 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894\": container with ID starting with 0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894 not found: ID does not exist" containerID="0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.780091 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894"} err="failed to get container status \"0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894\": rpc error: code = NotFound desc = could not find container \"0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894\": container with ID starting with 0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894 not found: ID does not exist"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.780117 4841 scope.go:117] "RemoveContainer" containerID="312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a"
Dec 03 17:22:00 crc kubenswrapper[4841]: E1203 17:22:00.780376 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a\": container with ID starting with 312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a not found: ID does not exist" containerID="312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.780410 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a"} err="failed to get container status \"312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a\": rpc error: code = NotFound desc = could not find container \"312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a\": container with ID starting with 312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a not found: ID does not exist"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.780429 4841 scope.go:117] "RemoveContainer" containerID="0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.780815 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894"} err="failed to get container status \"0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894\": rpc error: code = NotFound desc = could not find container \"0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894\": container with ID starting with 0c440ef234c5264de123077431c55a99d304957cff50d1729e4d257626b99894 not found: ID does not exist"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.780861 4841 scope.go:117] "RemoveContainer" containerID="312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.781223 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a"} err="failed to get container status \"312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a\": rpc error: code = NotFound desc = could not find container \"312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a\": container with ID starting with 312d4657212d5e143503e6e861a7a35f20e16bdf79c28ef98814d0110c7cd16a not found: ID does not exist"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.944433 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.955010 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.967931 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 03 17:22:00 crc kubenswrapper[4841]: E1203 17:22:00.968356 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126147ee-3dab-46a0-81c9-5e1e2793cd26" containerName="nova-manage"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.968371 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="126147ee-3dab-46a0-81c9-5e1e2793cd26" containerName="nova-manage"
Dec 03 17:22:00 crc kubenswrapper[4841]: E1203 17:22:00.968395 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerName="nova-api-log"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.968402 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerName="nova-api-log"
Dec 03 17:22:00 crc kubenswrapper[4841]: E1203 17:22:00.968427 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerName="nova-api-api"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.968434 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerName="nova-api-api"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.968613 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerName="nova-api-api"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.968625 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" containerName="nova-api-log"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.968647 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="126147ee-3dab-46a0-81c9-5e1e2793cd26" containerName="nova-manage"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.969723 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.973107 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.973232 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Dec 03 17:22:00 crc kubenswrapper[4841]: I1203 17:22:00.973302 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:00.998496 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.021674 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.021788 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv2gh\" (UniqueName: \"kubernetes.io/projected/4399b120-7a3b-430f-ad42-21a2c9bd0af5-kube-api-access-bv2gh\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.021884 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4399b120-7a3b-430f-ad42-21a2c9bd0af5-logs\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.021960 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-config-data\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.022113 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-public-tls-certs\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.022153 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.123809 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.123880 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv2gh\" (UniqueName: \"kubernetes.io/projected/4399b120-7a3b-430f-ad42-21a2c9bd0af5-kube-api-access-bv2gh\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.123951 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4399b120-7a3b-430f-ad42-21a2c9bd0af5-logs\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.123978 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-config-data\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.124017 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-public-tls-certs\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.124034 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.125123 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4399b120-7a3b-430f-ad42-21a2c9bd0af5-logs\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.129961 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.130135 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.130785 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-config-data\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.132892 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4399b120-7a3b-430f-ad42-21a2c9bd0af5-public-tls-certs\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.148378 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv2gh\" (UniqueName: \"kubernetes.io/projected/4399b120-7a3b-430f-ad42-21a2c9bd0af5-kube-api-access-bv2gh\") pod \"nova-api-0\" (UID: \"4399b120-7a3b-430f-ad42-21a2c9bd0af5\") " pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.330538 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.504366 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.619020 4841 generic.go:334] "Generic (PLEG): container finished" podID="95ce34c2-7044-4e6f-a58c-0c2504e7bb2a" containerID="73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4" exitCode=0
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.619077 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.619109 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a","Type":"ContainerDied","Data":"73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4"}
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.619158 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a","Type":"ContainerDied","Data":"02592d52b35000ec8d802e47d1af661678b557d7084cb84e692b79b4f5a29b09"}
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.619176 4841 scope.go:117] "RemoveContainer" containerID="73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.633938 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrsp9\" (UniqueName: \"kubernetes.io/projected/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-kube-api-access-rrsp9\") pod \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") "
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.634049 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-combined-ca-bundle\") pod \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") "
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.634090 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-config-data\") pod \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\" (UID: \"95ce34c2-7044-4e6f-a58c-0c2504e7bb2a\") "
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.641576 4841 scope.go:117] "RemoveContainer" containerID="73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.641640 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-kube-api-access-rrsp9" (OuterVolumeSpecName: "kube-api-access-rrsp9") pod "95ce34c2-7044-4e6f-a58c-0c2504e7bb2a" (UID: "95ce34c2-7044-4e6f-a58c-0c2504e7bb2a"). InnerVolumeSpecName "kube-api-access-rrsp9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:22:01 crc kubenswrapper[4841]: E1203 17:22:01.641985 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4\": container with ID starting with 73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4 not found: ID does not exist" containerID="73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.642022 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4"} err="failed to get container status \"73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4\": rpc error: code = NotFound desc = could not find container \"73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4\": container with ID starting with 73a36090884066cc02c667eb64ebc15a7bc18b0e4973f91be28fe45de3d6f9f4 not found: ID does not exist"
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.662444 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95ce34c2-7044-4e6f-a58c-0c2504e7bb2a" (UID: "95ce34c2-7044-4e6f-a58c-0c2504e7bb2a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.664450 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-config-data" (OuterVolumeSpecName: "config-data") pod "95ce34c2-7044-4e6f-a58c-0c2504e7bb2a" (UID: "95ce34c2-7044-4e6f-a58c-0c2504e7bb2a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.736722 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.736751 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrsp9\" (UniqueName: \"kubernetes.io/projected/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-kube-api-access-rrsp9\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.736761 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.821728 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.965136 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 17:22:01 crc kubenswrapper[4841]: I1203 17:22:01.973914 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.029558 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 17:22:02 crc kubenswrapper[4841]: E1203 17:22:02.031077 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95ce34c2-7044-4e6f-a58c-0c2504e7bb2a" containerName="nova-scheduler-scheduler"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.031116 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="95ce34c2-7044-4e6f-a58c-0c2504e7bb2a" containerName="nova-scheduler-scheduler"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.032750 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="95ce34c2-7044-4e6f-a58c-0c2504e7bb2a" containerName="nova-scheduler-scheduler"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.033657 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.038732 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.046915 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.047829 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0-config-data\") pod \"nova-scheduler-0\" (UID: \"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0\") " pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.047884 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0\") " pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.047938 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pck55\" (UniqueName: \"kubernetes.io/projected/2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0-kube-api-access-pck55\") pod \"nova-scheduler-0\" (UID: \"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0\") " pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.149520 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0-config-data\") pod \"nova-scheduler-0\" (UID: \"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0\") " pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.149577 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0\") " pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.149610 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pck55\" (UniqueName: \"kubernetes.io/projected/2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0-kube-api-access-pck55\") pod \"nova-scheduler-0\" (UID: \"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0\") " pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.152749 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0-config-data\") pod \"nova-scheduler-0\" (UID: \"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0\") " pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.161482 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0\") " pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.165023 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pck55\" (UniqueName: \"kubernetes.io/projected/2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0-kube-api-access-pck55\") pod \"nova-scheduler-0\" (UID: \"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0\") " pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.255650 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95ce34c2-7044-4e6f-a58c-0c2504e7bb2a" path="/var/lib/kubelet/pods/95ce34c2-7044-4e6f-a58c-0c2504e7bb2a/volumes"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.256769 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7cf2d6f-67f9-4764-8a3f-b54082dca105" path="/var/lib/kubelet/pods/e7cf2d6f-67f9-4764-8a3f-b54082dca105/volumes"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.366458 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.631560 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4399b120-7a3b-430f-ad42-21a2c9bd0af5","Type":"ContainerStarted","Data":"308d68707da58931640cfe67f887d17d1d4d694214949000de2d30b79fe53d11"}
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.632017 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4399b120-7a3b-430f-ad42-21a2c9bd0af5","Type":"ContainerStarted","Data":"dbe6a5f8295a47caaa92f7797f58e7310d0bd135cd8eeb1d82b5d1a994b2b877"}
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.632035 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4399b120-7a3b-430f-ad42-21a2c9bd0af5","Type":"ContainerStarted","Data":"c027820c2dca929d9656a9929a03bf5f2c674e0498350342915d31afbc8d9b53"}
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.659378 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.659360364 podStartE2EDuration="2.659360364s" podCreationTimestamp="2025-12-03 17:22:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:22:02.648283309 +0000 UTC m=+1317.035804056" watchObservedRunningTime="2025-12-03 17:22:02.659360364 +0000 UTC m=+1317.046881091"
Dec 03 17:22:02 crc kubenswrapper[4841]: I1203 17:22:02.811328 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.344016 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.371595 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-logs\") pod \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") "
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.371668 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-config-data\") pod \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") "
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.371746 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cstv4\" (UniqueName: \"kubernetes.io/projected/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-kube-api-access-cstv4\") pod \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") "
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.372072 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-nova-metadata-tls-certs\") pod \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") "
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.372167 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-combined-ca-bundle\") pod \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\" (UID: \"f0a3ff45-7f6c-4780-8821-2f46d98d23f9\") "
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.373460 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-logs" (OuterVolumeSpecName: "logs") pod "f0a3ff45-7f6c-4780-8821-2f46d98d23f9" (UID: "f0a3ff45-7f6c-4780-8821-2f46d98d23f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.379257 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-kube-api-access-cstv4" (OuterVolumeSpecName: "kube-api-access-cstv4") pod "f0a3ff45-7f6c-4780-8821-2f46d98d23f9" (UID: "f0a3ff45-7f6c-4780-8821-2f46d98d23f9"). InnerVolumeSpecName "kube-api-access-cstv4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.402957 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-config-data" (OuterVolumeSpecName: "config-data") pod "f0a3ff45-7f6c-4780-8821-2f46d98d23f9" (UID: "f0a3ff45-7f6c-4780-8821-2f46d98d23f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.404807 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0a3ff45-7f6c-4780-8821-2f46d98d23f9" (UID: "f0a3ff45-7f6c-4780-8821-2f46d98d23f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.451391 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "f0a3ff45-7f6c-4780-8821-2f46d98d23f9" (UID: "f0a3ff45-7f6c-4780-8821-2f46d98d23f9"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.473672 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.473706 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.473722 4841 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-logs\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.473731 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.473741 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cstv4\" (UniqueName: \"kubernetes.io/projected/f0a3ff45-7f6c-4780-8821-2f46d98d23f9-kube-api-access-cstv4\") on node \"crc\" DevicePath \"\""
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.647326 4841 generic.go:334] "Generic (PLEG): container finished" podID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerID="11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db" exitCode=0
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.647363 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.647390 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0a3ff45-7f6c-4780-8821-2f46d98d23f9","Type":"ContainerDied","Data":"11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db"}
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.647445 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f0a3ff45-7f6c-4780-8821-2f46d98d23f9","Type":"ContainerDied","Data":"aef552f29293469c302e96e85d42e4a696ef39727c39d48556025d175c539bc4"}
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.647466 4841 scope.go:117] "RemoveContainer" containerID="11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db"
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.651138 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0","Type":"ContainerStarted","Data":"8156404c7d0a908d21079e1da5de685fc4a0f867d2ea2cab6753b4957a649d38"}
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.651229 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0","Type":"ContainerStarted","Data":"f489ff9ca97bbdb4a284f2735acdb8c62739c6c7104539062fe12a75ef946a1f"}
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.676758 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.676735182 podStartE2EDuration="2.676735182s" podCreationTimestamp="2025-12-03 17:22:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:22:03.675981024 +0000 UTC m=+1318.063501781" watchObservedRunningTime="2025-12-03 17:22:03.676735182 +0000 UTC m=+1318.064255929"
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.679219 4841 scope.go:117] "RemoveContainer" containerID="5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91"
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.719791 4841 scope.go:117] "RemoveContainer" containerID="11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db"
Dec 03 17:22:03 crc kubenswrapper[4841]: E1203 17:22:03.725329 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db\": container with ID starting with 11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db not found: ID does not exist" containerID="11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db"
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.725387 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db"} err="failed to get container status \"11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db\": rpc error: code = NotFound desc = could not find container \"11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db\": container with ID starting with 11aaea323e234b39922c5a3205d9190299a6e4a38a3041b6f46200ef412c58db not found: ID does not exist"
Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.725414 4841 scope.go:117] "RemoveContainer" containerID="5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91"
Dec 03 17:22:03 crc kubenswrapper[4841]: E1203 17:22:03.726165 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91\": container with ID starting with 5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91 not found: ID does not exist" containerID="5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.726209 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91"} err="failed to get container status \"5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91\": rpc error: code = NotFound desc = could not find container \"5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91\": container with ID starting with 5e3c9784fd038bec67311112fab0c61b0d4790896b5efac8a7c81cee83813b91 not found: ID does not exist" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.727169 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.738775 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.745404 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:22:03 crc kubenswrapper[4841]: E1203 17:22:03.745836 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-log" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.745850 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-log" Dec 03 17:22:03 crc kubenswrapper[4841]: E1203 17:22:03.745867 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-metadata" Dec 03 
17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.745873 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-metadata" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.746076 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-metadata" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.746088 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" containerName="nova-metadata-log" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.747022 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.751558 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.753049 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.754155 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.779432 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.779478 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-logs\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") 
" pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.779540 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-config-data\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.779617 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.779651 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47zvv\" (UniqueName: \"kubernetes.io/projected/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-kube-api-access-47zvv\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.880950 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-config-data\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.881068 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.881140 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47zvv\" (UniqueName: \"kubernetes.io/projected/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-kube-api-access-47zvv\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.881266 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.881311 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-logs\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.886423 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-logs\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.887521 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-config-data\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.890686 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-nova-metadata-tls-certs\") pod \"nova-metadata-0\" 
(UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.900755 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:03 crc kubenswrapper[4841]: I1203 17:22:03.911152 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47zvv\" (UniqueName: \"kubernetes.io/projected/a3d1d17d-16e0-4160-93bc-3a926fedbfbd-kube-api-access-47zvv\") pod \"nova-metadata-0\" (UID: \"a3d1d17d-16e0-4160-93bc-3a926fedbfbd\") " pod="openstack/nova-metadata-0" Dec 03 17:22:04 crc kubenswrapper[4841]: I1203 17:22:04.067167 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 17:22:04 crc kubenswrapper[4841]: I1203 17:22:04.262576 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0a3ff45-7f6c-4780-8821-2f46d98d23f9" path="/var/lib/kubelet/pods/f0a3ff45-7f6c-4780-8821-2f46d98d23f9/volumes" Dec 03 17:22:04 crc kubenswrapper[4841]: I1203 17:22:04.589137 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 17:22:04 crc kubenswrapper[4841]: W1203 17:22:04.589204 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3d1d17d_16e0_4160_93bc_3a926fedbfbd.slice/crio-7db45f9e723ae331cbe5abc85f7306e3d0917eacd48e82114adb186f553b6566 WatchSource:0}: Error finding container 7db45f9e723ae331cbe5abc85f7306e3d0917eacd48e82114adb186f553b6566: Status 404 returned error can't find the container with id 7db45f9e723ae331cbe5abc85f7306e3d0917eacd48e82114adb186f553b6566 Dec 03 17:22:04 crc kubenswrapper[4841]: I1203 17:22:04.667668 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3d1d17d-16e0-4160-93bc-3a926fedbfbd","Type":"ContainerStarted","Data":"7db45f9e723ae331cbe5abc85f7306e3d0917eacd48e82114adb186f553b6566"} Dec 03 17:22:05 crc kubenswrapper[4841]: I1203 17:22:05.685267 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3d1d17d-16e0-4160-93bc-3a926fedbfbd","Type":"ContainerStarted","Data":"c646b1f9456034618155b955715b98f975611b126fc4934ee0d2ae4934b74089"} Dec 03 17:22:05 crc kubenswrapper[4841]: I1203 17:22:05.685634 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a3d1d17d-16e0-4160-93bc-3a926fedbfbd","Type":"ContainerStarted","Data":"80cca492275d650835856678b5776de3adf5f83563e151ad50a42b08c20deb25"} Dec 03 17:22:05 crc kubenswrapper[4841]: I1203 17:22:05.725294 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.725264212 podStartE2EDuration="2.725264212s" podCreationTimestamp="2025-12-03 17:22:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:22:05.713000437 +0000 UTC m=+1320.100521164" watchObservedRunningTime="2025-12-03 17:22:05.725264212 +0000 UTC m=+1320.112784969" Dec 03 17:22:07 crc kubenswrapper[4841]: I1203 17:22:07.368968 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 17:22:09 crc kubenswrapper[4841]: I1203 17:22:09.068028 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 17:22:09 crc kubenswrapper[4841]: I1203 17:22:09.069317 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 17:22:09 crc kubenswrapper[4841]: I1203 17:22:09.316794 4841 patch_prober.go:28] interesting 
pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:22:09 crc kubenswrapper[4841]: I1203 17:22:09.316895 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:22:11 crc kubenswrapper[4841]: I1203 17:22:11.332409 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 17:22:11 crc kubenswrapper[4841]: I1203 17:22:11.332990 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 17:22:12 crc kubenswrapper[4841]: I1203 17:22:12.347125 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4399b120-7a3b-430f-ad42-21a2c9bd0af5" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 17:22:12 crc kubenswrapper[4841]: I1203 17:22:12.347230 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4399b120-7a3b-430f-ad42-21a2c9bd0af5" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.208:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 17:22:12 crc kubenswrapper[4841]: I1203 17:22:12.369607 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 17:22:12 crc kubenswrapper[4841]: I1203 17:22:12.421673 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-scheduler-0" Dec 03 17:22:12 crc kubenswrapper[4841]: I1203 17:22:12.868134 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 17:22:14 crc kubenswrapper[4841]: I1203 17:22:14.068169 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 17:22:14 crc kubenswrapper[4841]: I1203 17:22:14.068221 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 17:22:15 crc kubenswrapper[4841]: I1203 17:22:15.082151 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a3d1d17d-16e0-4160-93bc-3a926fedbfbd" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 17:22:15 crc kubenswrapper[4841]: I1203 17:22:15.082175 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a3d1d17d-16e0-4160-93bc-3a926fedbfbd" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.210:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 17:22:21 crc kubenswrapper[4841]: I1203 17:22:21.341310 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 17:22:21 crc kubenswrapper[4841]: I1203 17:22:21.342529 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 17:22:21 crc kubenswrapper[4841]: I1203 17:22:21.355007 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 17:22:21 crc kubenswrapper[4841]: I1203 17:22:21.355764 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 17:22:21 crc 
kubenswrapper[4841]: I1203 17:22:21.938576 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 17:22:21 crc kubenswrapper[4841]: I1203 17:22:21.949301 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 17:22:24 crc kubenswrapper[4841]: I1203 17:22:24.076831 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 17:22:24 crc kubenswrapper[4841]: I1203 17:22:24.077428 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 17:22:24 crc kubenswrapper[4841]: I1203 17:22:24.087720 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 17:22:24 crc kubenswrapper[4841]: I1203 17:22:24.089191 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 17:22:24 crc kubenswrapper[4841]: I1203 17:22:24.930844 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 17:22:35 crc kubenswrapper[4841]: I1203 17:22:35.581862 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 17:22:36 crc kubenswrapper[4841]: I1203 17:22:36.724064 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 17:22:39 crc kubenswrapper[4841]: I1203 17:22:39.316117 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:22:39 crc kubenswrapper[4841]: I1203 17:22:39.316459 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" 
podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:22:40 crc kubenswrapper[4841]: I1203 17:22:40.347812 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e9e8bee6-ec4a-4743-9ca4-62c37c278958" containerName="rabbitmq" containerID="cri-o://d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0" gracePeriod=604796 Dec 03 17:22:42 crc kubenswrapper[4841]: I1203 17:22:42.930119 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="388e49e3-0d92-49a4-a165-810b7ac67577" containerName="rabbitmq" containerID="cri-o://b093fdb21f484539e2af54030a2f74d5911234561939db70e5925af20e87f3ef" gracePeriod=604794 Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.019235 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.056492 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9e8bee6-ec4a-4743-9ca4-62c37c278958-pod-info\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.056556 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-server-conf\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.056648 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9e8bee6-ec4a-4743-9ca4-62c37c278958-erlang-cookie-secret\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.056687 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-plugins\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.056748 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbf9t\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-kube-api-access-wbf9t\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.056783 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-confd\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.056803 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-erlang-cookie\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.057004 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.057032 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-config-data\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.057057 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-tls\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.057100 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-plugins-conf\") pod \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\" (UID: \"e9e8bee6-ec4a-4743-9ca4-62c37c278958\") " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 
17:22:47.059270 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.059813 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.060571 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.068394 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.071412 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "persistence") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.075483 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e9e8bee6-ec4a-4743-9ca4-62c37c278958-pod-info" (OuterVolumeSpecName: "pod-info") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.078663 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9e8bee6-ec4a-4743-9ca4-62c37c278958-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.083243 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-kube-api-access-wbf9t" (OuterVolumeSpecName: "kube-api-access-wbf9t") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "kube-api-access-wbf9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.159038 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.162250 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.162365 4841 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.162425 4841 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e9e8bee6-ec4a-4743-9ca4-62c37c278958-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.162480 4841 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e9e8bee6-ec4a-4743-9ca4-62c37c278958-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.162558 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.162623 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbf9t\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-kube-api-access-wbf9t\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.162695 4841 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.185730 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-config-data" (OuterVolumeSpecName: "config-data") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.198238 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.199935 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-server-conf" (OuterVolumeSpecName: "server-conf") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.231257 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e9e8bee6-ec4a-4743-9ca4-62c37c278958" (UID: "e9e8bee6-ec4a-4743-9ca4-62c37c278958"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.263991 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.264035 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.264052 4841 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e9e8bee6-ec4a-4743-9ca4-62c37c278958-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.264063 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e9e8bee6-ec4a-4743-9ca4-62c37c278958-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.534287 4841 generic.go:334] "Generic (PLEG): container finished" podID="e9e8bee6-ec4a-4743-9ca4-62c37c278958" containerID="d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0" exitCode=0 Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.534337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9e8bee6-ec4a-4743-9ca4-62c37c278958","Type":"ContainerDied","Data":"d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0"} Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.534363 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e9e8bee6-ec4a-4743-9ca4-62c37c278958","Type":"ContainerDied","Data":"e10de9fe60289ecf1869b4efca723fe3888f9f15595e59300e9c802096f0d8bc"} Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 
17:22:47.534380 4841 scope.go:117] "RemoveContainer" containerID="d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.534389 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.569393 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.570365 4841 scope.go:117] "RemoveContainer" containerID="b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.578388 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.597782 4841 scope.go:117] "RemoveContainer" containerID="d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0" Dec 03 17:22:47 crc kubenswrapper[4841]: E1203 17:22:47.598336 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0\": container with ID starting with d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0 not found: ID does not exist" containerID="d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.598398 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0"} err="failed to get container status \"d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0\": rpc error: code = NotFound desc = could not find container \"d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0\": container with ID starting with 
d9248fce35076415b2c1a7bce27d3e90333d3626ba25d759d4ca00ae06ccbba0 not found: ID does not exist" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.598426 4841 scope.go:117] "RemoveContainer" containerID="b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a" Dec 03 17:22:47 crc kubenswrapper[4841]: E1203 17:22:47.598705 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a\": container with ID starting with b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a not found: ID does not exist" containerID="b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.598733 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a"} err="failed to get container status \"b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a\": rpc error: code = NotFound desc = could not find container \"b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a\": container with ID starting with b00e9f32dfed66a0eaad554f12af59947fc0cf4d72fb4e0e8ca791631756f46a not found: ID does not exist" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.657582 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 17:22:47 crc kubenswrapper[4841]: E1203 17:22:47.660216 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9e8bee6-ec4a-4743-9ca4-62c37c278958" containerName="setup-container" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.660244 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e8bee6-ec4a-4743-9ca4-62c37c278958" containerName="setup-container" Dec 03 17:22:47 crc kubenswrapper[4841]: E1203 17:22:47.660301 4841 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="e9e8bee6-ec4a-4743-9ca4-62c37c278958" containerName="rabbitmq" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.660308 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9e8bee6-ec4a-4743-9ca4-62c37c278958" containerName="rabbitmq" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.661758 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9e8bee6-ec4a-4743-9ca4-62c37c278958" containerName="rabbitmq" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.665329 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.668453 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.672690 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.672691 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.672838 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.673075 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-5j9kb" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.674171 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.674197 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.678609 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 17:22:47 crc 
kubenswrapper[4841]: I1203 17:22:47.782602 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4eda350-169d-4b70-be32-13d2a1ab1aa3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782652 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4eda350-169d-4b70-be32-13d2a1ab1aa3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782680 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4eda350-169d-4b70-be32-13d2a1ab1aa3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782733 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782753 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782806 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782829 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4eda350-169d-4b70-be32-13d2a1ab1aa3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782845 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ldpl\" (UniqueName: \"kubernetes.io/projected/b4eda350-169d-4b70-be32-13d2a1ab1aa3-kube-api-access-7ldpl\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782864 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782878 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.782894 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4eda350-169d-4b70-be32-13d2a1ab1aa3-config-data\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884271 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b4eda350-169d-4b70-be32-13d2a1ab1aa3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884315 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ldpl\" (UniqueName: \"kubernetes.io/projected/b4eda350-169d-4b70-be32-13d2a1ab1aa3-kube-api-access-7ldpl\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884346 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884374 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884395 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4eda350-169d-4b70-be32-13d2a1ab1aa3-config-data\") pod 
\"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884447 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4eda350-169d-4b70-be32-13d2a1ab1aa3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884484 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4eda350-169d-4b70-be32-13d2a1ab1aa3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884511 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4eda350-169d-4b70-be32-13d2a1ab1aa3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884609 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.884691 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.885306 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.885493 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.885652 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.886241 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b4eda350-169d-4b70-be32-13d2a1ab1aa3-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.886398 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/b4eda350-169d-4b70-be32-13d2a1ab1aa3-server-conf\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.886725 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b4eda350-169d-4b70-be32-13d2a1ab1aa3-config-data\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.889441 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b4eda350-169d-4b70-be32-13d2a1ab1aa3-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.889658 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.890120 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b4eda350-169d-4b70-be32-13d2a1ab1aa3-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.891601 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b4eda350-169d-4b70-be32-13d2a1ab1aa3-pod-info\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc 
kubenswrapper[4841]: I1203 17:22:47.905451 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ldpl\" (UniqueName: \"kubernetes.io/projected/b4eda350-169d-4b70-be32-13d2a1ab1aa3-kube-api-access-7ldpl\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.921392 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"rabbitmq-server-0\" (UID: \"b4eda350-169d-4b70-be32-13d2a1ab1aa3\") " pod="openstack/rabbitmq-server-0" Dec 03 17:22:47 crc kubenswrapper[4841]: I1203 17:22:47.997742 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 17:22:48 crc kubenswrapper[4841]: I1203 17:22:48.254559 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9e8bee6-ec4a-4743-9ca4-62c37c278958" path="/var/lib/kubelet/pods/e9e8bee6-ec4a-4743-9ca4-62c37c278958/volumes" Dec 03 17:22:48 crc kubenswrapper[4841]: I1203 17:22:48.475149 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 17:22:48 crc kubenswrapper[4841]: I1203 17:22:48.550704 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b4eda350-169d-4b70-be32-13d2a1ab1aa3","Type":"ContainerStarted","Data":"1f07532f371c85ee02bb4ba2b51c143d84d457641d55fd5f7c9237f360b7373d"} Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.807165 4841 generic.go:334] "Generic (PLEG): container finished" podID="388e49e3-0d92-49a4-a165-810b7ac67577" containerID="b093fdb21f484539e2af54030a2f74d5911234561939db70e5925af20e87f3ef" exitCode=0 Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.808215 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"388e49e3-0d92-49a4-a165-810b7ac67577","Type":"ContainerDied","Data":"b093fdb21f484539e2af54030a2f74d5911234561939db70e5925af20e87f3ef"} Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.808288 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"388e49e3-0d92-49a4-a165-810b7ac67577","Type":"ContainerDied","Data":"bbaac5a23ce473804b5fc66b00c891ee2cd8b581ab87bebe5a52d1ba32fa4e90"} Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.808304 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbaac5a23ce473804b5fc66b00c891ee2cd8b581ab87bebe5a52d1ba32fa4e90" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.823077 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.882786 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95nhj\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-kube-api-access-95nhj\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.883038 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/388e49e3-0d92-49a4-a165-810b7ac67577-pod-info\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.883097 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-config-data\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.883237 4841 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-confd\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.883894 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-erlang-cookie\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.884085 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-server-conf\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.884107 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/388e49e3-0d92-49a4-a165-810b7ac67577-erlang-cookie-secret\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.885799 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-plugins-conf\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.885931 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod 
"388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.886755 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.886803 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-plugins\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.886847 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-tls\") pod \"388e49e3-0d92-49a4-a165-810b7ac67577\" (UID: \"388e49e3-0d92-49a4-a165-810b7ac67577\") " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.887079 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.888030 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.888500 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.888545 4841 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.888558 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.891024 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/388e49e3-0d92-49a4-a165-810b7ac67577-pod-info" (OuterVolumeSpecName: "pod-info") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.929808 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/388e49e3-0d92-49a4-a165-810b7ac67577-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.930442 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-kube-api-access-95nhj" (OuterVolumeSpecName: "kube-api-access-95nhj") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "kube-api-access-95nhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.933428 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.934589 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.954748 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-server-conf" (OuterVolumeSpecName: "server-conf") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.990975 4841 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/388e49e3-0d92-49a4-a165-810b7ac67577-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.991239 4841 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.991367 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.991464 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.991552 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95nhj\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-kube-api-access-95nhj\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:49 crc kubenswrapper[4841]: I1203 17:22:49.991642 4841 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/388e49e3-0d92-49a4-a165-810b7ac67577-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.016920 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.093157 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.134227 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-tbzvp"] Dec 03 17:22:50 crc kubenswrapper[4841]: E1203 17:22:50.134721 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388e49e3-0d92-49a4-a165-810b7ac67577" containerName="rabbitmq" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.134759 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="388e49e3-0d92-49a4-a165-810b7ac67577" containerName="rabbitmq" Dec 03 17:22:50 crc kubenswrapper[4841]: E1203 17:22:50.134783 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="388e49e3-0d92-49a4-a165-810b7ac67577" containerName="setup-container" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.134793 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="388e49e3-0d92-49a4-a165-810b7ac67577" containerName="setup-container" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.135011 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="388e49e3-0d92-49a4-a165-810b7ac67577" containerName="rabbitmq" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.135955 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.140601 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.153182 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-tbzvp"] Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.194417 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-config\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.194489 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn28r\" (UniqueName: \"kubernetes.io/projected/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-kube-api-access-tn28r\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.194534 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.194554 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.195174 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.195290 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.195559 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.297305 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-config\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.297361 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn28r\" (UniqueName: \"kubernetes.io/projected/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-kube-api-access-tn28r\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " 
pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.297420 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.297438 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.297484 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.297542 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.297602 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 
03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.298745 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-svc\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.298869 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-swift-storage-0\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.299149 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-nb\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.299205 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-openstack-edpm-ipam\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.299374 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-config\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.299892 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-sb\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.316523 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn28r\" (UniqueName: \"kubernetes.io/projected/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-kube-api-access-tn28r\") pod \"dnsmasq-dns-7d84b4d45c-tbzvp\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") " pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.343395 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-config-data" (OuterVolumeSpecName: "config-data") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.400371 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/388e49e3-0d92-49a4-a165-810b7ac67577-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.635680 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "388e49e3-0d92-49a4-a165-810b7ac67577" (UID: "388e49e3-0d92-49a4-a165-810b7ac67577"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.702655 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.715251 4841 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/388e49e3-0d92-49a4-a165-810b7ac67577-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.816724 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.894150 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.908126 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.919560 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.925241 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.927075 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.928275 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-vx5zq" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.928365 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.928426 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.928681 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.929589 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.929894 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 17:22:50 crc kubenswrapper[4841]: I1203 17:22:50.931507 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.004473 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-tbzvp"] Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.019886 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" 
Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.019958 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dbbe92f9-c159-49ce-90ab-dd67ff712b36-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.019992 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbbe92f9-c159-49ce-90ab-dd67ff712b36-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.020018 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dbbe92f9-c159-49ce-90ab-dd67ff712b36-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.020047 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.020078 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 
17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.020101 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6mpf\" (UniqueName: \"kubernetes.io/projected/dbbe92f9-c159-49ce-90ab-dd67ff712b36-kube-api-access-q6mpf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.020125 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dbbe92f9-c159-49ce-90ab-dd67ff712b36-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.020151 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.020197 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.020253 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dbbe92f9-c159-49ce-90ab-dd67ff712b36-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc 
kubenswrapper[4841]: W1203 17:22:51.028477 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb65e9c61_b5d7_47ec_b0f9_ed6dcf5fafd0.slice/crio-cec89b1c50298164bc3b6cedd64462eb59e675b0addab755cc998767afdccef0 WatchSource:0}: Error finding container cec89b1c50298164bc3b6cedd64462eb59e675b0addab755cc998767afdccef0: Status 404 returned error can't find the container with id cec89b1c50298164bc3b6cedd64462eb59e675b0addab755cc998767afdccef0 Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.120820 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dbbe92f9-c159-49ce-90ab-dd67ff712b36-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.121009 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.121879 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dbbe92f9-c159-49ce-90ab-dd67ff712b36-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.121950 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbbe92f9-c159-49ce-90ab-dd67ff712b36-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.121980 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dbbe92f9-c159-49ce-90ab-dd67ff712b36-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.122006 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.122043 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.122071 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6mpf\" (UniqueName: \"kubernetes.io/projected/dbbe92f9-c159-49ce-90ab-dd67ff712b36-kube-api-access-q6mpf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.122097 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dbbe92f9-c159-49ce-90ab-dd67ff712b36-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: 
I1203 17:22:51.122129 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.122230 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.122305 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dbbe92f9-c159-49ce-90ab-dd67ff712b36-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.122697 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.122967 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.123392 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" 
(UniqueName: \"kubernetes.io/configmap/dbbe92f9-c159-49ce-90ab-dd67ff712b36-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.123598 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dbbe92f9-c159-49ce-90ab-dd67ff712b36-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.123744 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.125459 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dbbe92f9-c159-49ce-90ab-dd67ff712b36-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.125506 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.126393 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dbbe92f9-c159-49ce-90ab-dd67ff712b36-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.130828 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dbbe92f9-c159-49ce-90ab-dd67ff712b36-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.139468 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6mpf\" (UniqueName: \"kubernetes.io/projected/dbbe92f9-c159-49ce-90ab-dd67ff712b36-kube-api-access-q6mpf\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.186746 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"dbbe92f9-c159-49ce-90ab-dd67ff712b36\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.256978 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.829873 4841 generic.go:334] "Generic (PLEG): container finished" podID="b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" containerID="04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a" exitCode=0
Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.829949 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" event={"ID":"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0","Type":"ContainerDied","Data":"04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a"}
Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.830253 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" event={"ID":"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0","Type":"ContainerStarted","Data":"cec89b1c50298164bc3b6cedd64462eb59e675b0addab755cc998767afdccef0"}
Dec 03 17:22:51 crc kubenswrapper[4841]: I1203 17:22:51.918899 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Dec 03 17:22:51 crc kubenswrapper[4841]: W1203 17:22:51.935254 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbbe92f9_c159_49ce_90ab_dd67ff712b36.slice/crio-bf70651485ed3057f4f782cf4750df08d1982ddbd1a7ad756daecc75b1c11eb8 WatchSource:0}: Error finding container bf70651485ed3057f4f782cf4750df08d1982ddbd1a7ad756daecc75b1c11eb8: Status 404 returned error can't find the container with id bf70651485ed3057f4f782cf4750df08d1982ddbd1a7ad756daecc75b1c11eb8
Dec 03 17:22:52 crc kubenswrapper[4841]: I1203 17:22:52.253851 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="388e49e3-0d92-49a4-a165-810b7ac67577" path="/var/lib/kubelet/pods/388e49e3-0d92-49a4-a165-810b7ac67577/volumes"
Dec 03 17:22:52 crc kubenswrapper[4841]: I1203 17:22:52.849177 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" event={"ID":"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0","Type":"ContainerStarted","Data":"5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4"}
Dec 03 17:22:52 crc kubenswrapper[4841]: I1203 17:22:52.849370 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp"
Dec 03 17:22:52 crc kubenswrapper[4841]: I1203 17:22:52.851391 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dbbe92f9-c159-49ce-90ab-dd67ff712b36","Type":"ContainerStarted","Data":"bf70651485ed3057f4f782cf4750df08d1982ddbd1a7ad756daecc75b1c11eb8"}
Dec 03 17:22:52 crc kubenswrapper[4841]: I1203 17:22:52.889407 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" podStartSLOduration=2.889378204 podStartE2EDuration="2.889378204s" podCreationTimestamp="2025-12-03 17:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:22:52.878611636 +0000 UTC m=+1367.266132403" watchObservedRunningTime="2025-12-03 17:22:52.889378204 +0000 UTC m=+1367.276898961"
Dec 03 17:22:53 crc kubenswrapper[4841]: I1203 17:22:53.871618 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dbbe92f9-c159-49ce-90ab-dd67ff712b36","Type":"ContainerStarted","Data":"0d7c883e1b3b8f34216d4a5c019352b3feb265ddd573253d667fe57f3a33296f"}
Dec 03 17:22:54 crc kubenswrapper[4841]: I1203 17:22:54.775846 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="388e49e3-0d92-49a4-a165-810b7ac67577" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: i/o timeout"
Dec 03 17:22:58 crc kubenswrapper[4841]: I1203 17:22:58.928808 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b4eda350-169d-4b70-be32-13d2a1ab1aa3","Type":"ContainerStarted","Data":"626af222fad75c164e3661d763ac83ebd8c9835c59252094377fa8ccf58f5a49"}
Dec 03 17:23:00 crc kubenswrapper[4841]: I1203 17:23:00.704630 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp"
Dec 03 17:23:00 crc kubenswrapper[4841]: I1203 17:23:00.803937 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m"]
Dec 03 17:23:00 crc kubenswrapper[4841]: I1203 17:23:00.804489 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" podUID="9ab11961-383b-4fe3-bdc2-fe78e71617c0" containerName="dnsmasq-dns" containerID="cri-o://2003532e07052a42e74569cc276039097a7e0d21abc8b1e946a39b48361f0990" gracePeriod=10
Dec 03 17:23:00 crc kubenswrapper[4841]: I1203 17:23:00.958154 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ab11961-383b-4fe3-bdc2-fe78e71617c0" containerID="2003532e07052a42e74569cc276039097a7e0d21abc8b1e946a39b48361f0990" exitCode=0
Dec 03 17:23:00 crc kubenswrapper[4841]: I1203 17:23:00.958204 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" event={"ID":"9ab11961-383b-4fe3-bdc2-fe78e71617c0","Type":"ContainerDied","Data":"2003532e07052a42e74569cc276039097a7e0d21abc8b1e946a39b48361f0990"}
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.104024 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-r9xqr"]
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.105789 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.126807 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-r9xqr"]
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.242158 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.242249 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-config\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.242332 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.242354 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.242372 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.242597 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvzzj\" (UniqueName: \"kubernetes.io/projected/42fb8094-c7e2-45f8-932f-e6b868d4cc38-kube-api-access-pvzzj\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.242621 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.343685 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.343738 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.343812 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-config\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.344082 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.344107 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.344141 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.344229 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvzzj\" (UniqueName: \"kubernetes.io/projected/42fb8094-c7e2-45f8-932f-e6b868d4cc38-kube-api-access-pvzzj\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.345771 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-openstack-edpm-ipam\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.345806 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-dns-svc\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.346131 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-config\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.346525 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-ovsdbserver-nb\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.346752 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-dns-swift-storage-0\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.347202 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/42fb8094-c7e2-45f8-932f-e6b868d4cc38-ovsdbserver-sb\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.367653 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvzzj\" (UniqueName: \"kubernetes.io/projected/42fb8094-c7e2-45f8-932f-e6b868d4cc38-kube-api-access-pvzzj\") pod \"dnsmasq-dns-6f6df4f56c-r9xqr\" (UID: \"42fb8094-c7e2-45f8-932f-e6b868d4cc38\") " pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.425837 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.453194 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.564067 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-config\") pod \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") "
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.564133 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-sb\") pod \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") "
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.564224 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-svc\") pod \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") "
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.564243 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-nb\") pod \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") "
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.564367 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-swift-storage-0\") pod \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") "
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.564410 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmt82\" (UniqueName: \"kubernetes.io/projected/9ab11961-383b-4fe3-bdc2-fe78e71617c0-kube-api-access-mmt82\") pod \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\" (UID: \"9ab11961-383b-4fe3-bdc2-fe78e71617c0\") "
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.577335 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab11961-383b-4fe3-bdc2-fe78e71617c0-kube-api-access-mmt82" (OuterVolumeSpecName: "kube-api-access-mmt82") pod "9ab11961-383b-4fe3-bdc2-fe78e71617c0" (UID: "9ab11961-383b-4fe3-bdc2-fe78e71617c0"). InnerVolumeSpecName "kube-api-access-mmt82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.625090 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9ab11961-383b-4fe3-bdc2-fe78e71617c0" (UID: "9ab11961-383b-4fe3-bdc2-fe78e71617c0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.626666 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9ab11961-383b-4fe3-bdc2-fe78e71617c0" (UID: "9ab11961-383b-4fe3-bdc2-fe78e71617c0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.639254 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9ab11961-383b-4fe3-bdc2-fe78e71617c0" (UID: "9ab11961-383b-4fe3-bdc2-fe78e71617c0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.644545 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-config" (OuterVolumeSpecName: "config") pod "9ab11961-383b-4fe3-bdc2-fe78e71617c0" (UID: "9ab11961-383b-4fe3-bdc2-fe78e71617c0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.653696 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9ab11961-383b-4fe3-bdc2-fe78e71617c0" (UID: "9ab11961-383b-4fe3-bdc2-fe78e71617c0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.666601 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-config\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.666637 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.666652 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-svc\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.666662 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.666672 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9ab11961-383b-4fe3-bdc2-fe78e71617c0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.666684 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmt82\" (UniqueName: \"kubernetes.io/projected/9ab11961-383b-4fe3-bdc2-fe78e71617c0-kube-api-access-mmt82\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.951847 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6f6df4f56c-r9xqr"]
Dec 03 17:23:01 crc kubenswrapper[4841]: W1203 17:23:01.961110 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42fb8094_c7e2_45f8_932f_e6b868d4cc38.slice/crio-c529b83da6fcb90c5aabc6bab23445b415f76c539289221e9e3c609c7f20f2ac WatchSource:0}: Error finding container c529b83da6fcb90c5aabc6bab23445b415f76c539289221e9e3c609c7f20f2ac: Status 404 returned error can't find the container with id c529b83da6fcb90c5aabc6bab23445b415f76c539289221e9e3c609c7f20f2ac
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.970548 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m" event={"ID":"9ab11961-383b-4fe3-bdc2-fe78e71617c0","Type":"ContainerDied","Data":"9a505b2359b05d83d0894725a5f78889ac24403722037bd1dca85cae163fefc0"}
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.970618 4841 scope.go:117] "RemoveContainer" containerID="2003532e07052a42e74569cc276039097a7e0d21abc8b1e946a39b48361f0990"
Dec 03 17:23:01 crc kubenswrapper[4841]: I1203 17:23:01.970616 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m"
Dec 03 17:23:02 crc kubenswrapper[4841]: I1203 17:23:02.128415 4841 scope.go:117] "RemoveContainer" containerID="b18efab15d61ef2a77fc178364f05549dbdcecbb400909bacb200d1a7317f68b"
Dec 03 17:23:02 crc kubenswrapper[4841]: I1203 17:23:02.154555 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m"]
Dec 03 17:23:02 crc kubenswrapper[4841]: I1203 17:23:02.166685 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7bbf7cf9-gmd9m"]
Dec 03 17:23:02 crc kubenswrapper[4841]: I1203 17:23:02.251725 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ab11961-383b-4fe3-bdc2-fe78e71617c0" path="/var/lib/kubelet/pods/9ab11961-383b-4fe3-bdc2-fe78e71617c0/volumes"
Dec 03 17:23:02 crc kubenswrapper[4841]: I1203 17:23:02.982335 4841 generic.go:334] "Generic (PLEG): container finished" podID="42fb8094-c7e2-45f8-932f-e6b868d4cc38" containerID="0aa342c2b6c9b0adb2ff3b86bea1f5bb4929179b70ca6f3c1135a048ef70a088" exitCode=0
Dec 03 17:23:02 crc kubenswrapper[4841]: I1203 17:23:02.982413 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr" event={"ID":"42fb8094-c7e2-45f8-932f-e6b868d4cc38","Type":"ContainerDied","Data":"0aa342c2b6c9b0adb2ff3b86bea1f5bb4929179b70ca6f3c1135a048ef70a088"}
Dec 03 17:23:02 crc kubenswrapper[4841]: I1203 17:23:02.982448 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr" event={"ID":"42fb8094-c7e2-45f8-932f-e6b868d4cc38","Type":"ContainerStarted","Data":"c529b83da6fcb90c5aabc6bab23445b415f76c539289221e9e3c609c7f20f2ac"}
Dec 03 17:23:03 crc kubenswrapper[4841]: I1203 17:23:03.995498 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr" event={"ID":"42fb8094-c7e2-45f8-932f-e6b868d4cc38","Type":"ContainerStarted","Data":"c4424a0b1abb009f43db02fb495f131113d09662b0862ae260b4aa68819973b0"}
Dec 03 17:23:03 crc kubenswrapper[4841]: I1203 17:23:03.995653 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:04 crc kubenswrapper[4841]: I1203 17:23:04.023696 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr" podStartSLOduration=3.023675409 podStartE2EDuration="3.023675409s" podCreationTimestamp="2025-12-03 17:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:23:04.022395527 +0000 UTC m=+1378.409916294" watchObservedRunningTime="2025-12-03 17:23:04.023675409 +0000 UTC m=+1378.411196156"
Dec 03 17:23:09 crc kubenswrapper[4841]: I1203 17:23:09.316472 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 17:23:09 crc kubenswrapper[4841]: I1203 17:23:09.317208 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 17:23:09 crc kubenswrapper[4841]: I1203 17:23:09.317278 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk"
Dec 03 17:23:09 crc kubenswrapper[4841]: I1203 17:23:09.318472 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba0a1c8798769f6bd460530e7ac2c690f029124a02f84950fbf6164264b3c8a5"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 17:23:09 crc kubenswrapper[4841]: I1203 17:23:09.318562 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://ba0a1c8798769f6bd460530e7ac2c690f029124a02f84950fbf6164264b3c8a5" gracePeriod=600
Dec 03 17:23:10 crc kubenswrapper[4841]: I1203 17:23:10.060201 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="ba0a1c8798769f6bd460530e7ac2c690f029124a02f84950fbf6164264b3c8a5" exitCode=0
Dec 03 17:23:10 crc kubenswrapper[4841]: I1203 17:23:10.060275 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"ba0a1c8798769f6bd460530e7ac2c690f029124a02f84950fbf6164264b3c8a5"}
Dec 03 17:23:10 crc kubenswrapper[4841]: I1203 17:23:10.060809 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686"}
Dec 03 17:23:10 crc kubenswrapper[4841]: I1203 17:23:10.060830 4841 scope.go:117] "RemoveContainer" containerID="329f3ec52dfd2e9fae18e4f92bbdfa693dee71eed4da1af39ebdcda2381dc16d"
Dec 03 17:23:11 crc kubenswrapper[4841]: I1203 17:23:11.457119 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6f6df4f56c-r9xqr"
Dec 03 17:23:11 crc kubenswrapper[4841]: I1203 17:23:11.556206 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-tbzvp"]
Dec 03 17:23:11 crc kubenswrapper[4841]: I1203 17:23:11.556507 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" podUID="b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" containerName="dnsmasq-dns" containerID="cri-o://5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4" gracePeriod=10
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.042475 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp"
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.057087 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn28r\" (UniqueName: \"kubernetes.io/projected/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-kube-api-access-tn28r\") pod \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") "
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.057232 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-swift-storage-0\") pod \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") "
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.057263 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-openstack-edpm-ipam\") pod \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") "
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.057299 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-svc\") pod \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") "
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.057329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-config\") pod \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") "
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.057345 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-sb\") pod \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") "
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.057371 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-nb\") pod \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\" (UID: \"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0\") "
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.065287 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-kube-api-access-tn28r" (OuterVolumeSpecName: "kube-api-access-tn28r") pod "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" (UID: "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0"). InnerVolumeSpecName "kube-api-access-tn28r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.090468 4841 generic.go:334] "Generic (PLEG): container finished" podID="b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" containerID="5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4" exitCode=0
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.090530 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" event={"ID":"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0","Type":"ContainerDied","Data":"5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4"}
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.090571 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp" event={"ID":"b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0","Type":"ContainerDied","Data":"cec89b1c50298164bc3b6cedd64462eb59e675b0addab755cc998767afdccef0"}
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.090599 4841 scope.go:117] "RemoveContainer" containerID="5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4"
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.090829 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7d84b4d45c-tbzvp"
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.135638 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" (UID: "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.136788 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" (UID: "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.142060 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" (UID: "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.146595 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-config" (OuterVolumeSpecName: "config") pod "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" (UID: "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.157627 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" (UID: "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.159012 4841 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.159034 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.159047 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-config\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.159056 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.159064 4841 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.159074 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn28r\" (UniqueName: \"kubernetes.io/projected/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-kube-api-access-tn28r\") on node \"crc\" DevicePath \"\""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.161545 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" (UID: "b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.175765 4841 scope.go:117] "RemoveContainer" containerID="04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a"
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.195478 4841 scope.go:117] "RemoveContainer" containerID="5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4"
Dec 03 17:23:12 crc kubenswrapper[4841]: E1203 17:23:12.195900 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4\": container with ID starting with 5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4 not found: ID does not exist" containerID="5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4"
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.196016 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4"} err="failed to get container status \"5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4\": rpc error: code = NotFound desc = could not find container \"5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4\": container with ID starting with 5140b5836b0b4467acf9372ae867d3ff68a79dd8701a3488bb6d663a3e8bfbb4 not found: ID does not exist"
Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.196109 4841 scope.go:117] "RemoveContainer" containerID="04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a"
Dec 03 17:23:12 crc kubenswrapper[4841]: E1203 17:23:12.196522 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a\": container with ID 
starting with 04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a not found: ID does not exist" containerID="04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a" Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.196548 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a"} err="failed to get container status \"04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a\": rpc error: code = NotFound desc = could not find container \"04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a\": container with ID starting with 04a0be65ed60c196f07de7fe942f88751edeef8dc80fcd449e8b7b5311088c1a not found: ID does not exist" Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.261104 4841 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.416175 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-tbzvp"] Dec 03 17:23:12 crc kubenswrapper[4841]: I1203 17:23:12.423905 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7d84b4d45c-tbzvp"] Dec 03 17:23:14 crc kubenswrapper[4841]: I1203 17:23:14.252417 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" path="/var/lib/kubelet/pods/b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0/volumes" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.047610 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx"] Dec 03 17:23:20 crc kubenswrapper[4841]: E1203 17:23:20.048722 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" containerName="dnsmasq-dns" 
Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.048742 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" containerName="dnsmasq-dns" Dec 03 17:23:20 crc kubenswrapper[4841]: E1203 17:23:20.048782 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" containerName="init" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.048790 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" containerName="init" Dec 03 17:23:20 crc kubenswrapper[4841]: E1203 17:23:20.048809 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab11961-383b-4fe3-bdc2-fe78e71617c0" containerName="dnsmasq-dns" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.048818 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab11961-383b-4fe3-bdc2-fe78e71617c0" containerName="dnsmasq-dns" Dec 03 17:23:20 crc kubenswrapper[4841]: E1203 17:23:20.048836 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab11961-383b-4fe3-bdc2-fe78e71617c0" containerName="init" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.048844 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab11961-383b-4fe3-bdc2-fe78e71617c0" containerName="init" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.049101 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab11961-383b-4fe3-bdc2-fe78e71617c0" containerName="dnsmasq-dns" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.049132 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b65e9c61-b5d7-47ec-b0f9-ed6dcf5fafd0" containerName="dnsmasq-dns" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.049731 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.058595 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx"] Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.059990 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.060130 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.060226 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.060368 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.124480 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.124676 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.124763 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2964\" (UniqueName: \"kubernetes.io/projected/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-kube-api-access-g2964\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.124872 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.226407 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.226520 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.226554 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2964\" (UniqueName: \"kubernetes.io/projected/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-kube-api-access-g2964\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.226596 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.235772 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.235989 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.237078 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.243631 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2964\" (UniqueName: \"kubernetes.io/projected/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-kube-api-access-g2964\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.377048 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:20 crc kubenswrapper[4841]: W1203 17:23:20.937452 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc02f43bb_1c6c_45cc_bacc_3b8b0dd514a2.slice/crio-c304fe99ba0d2e70b3d819acfe4dc5b4c01c6722b1822a75b3c665819bc398ef WatchSource:0}: Error finding container c304fe99ba0d2e70b3d819acfe4dc5b4c01c6722b1822a75b3c665819bc398ef: Status 404 returned error can't find the container with id c304fe99ba0d2e70b3d819acfe4dc5b4c01c6722b1822a75b3c665819bc398ef Dec 03 17:23:20 crc kubenswrapper[4841]: I1203 17:23:20.938248 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx"] Dec 03 17:23:21 crc kubenswrapper[4841]: I1203 17:23:21.206337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" event={"ID":"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2","Type":"ContainerStarted","Data":"c304fe99ba0d2e70b3d819acfe4dc5b4c01c6722b1822a75b3c665819bc398ef"} Dec 03 17:23:26 crc kubenswrapper[4841]: I1203 17:23:26.269062 4841 generic.go:334] "Generic (PLEG): container finished" podID="dbbe92f9-c159-49ce-90ab-dd67ff712b36" containerID="0d7c883e1b3b8f34216d4a5c019352b3feb265ddd573253d667fe57f3a33296f" exitCode=0 Dec 03 17:23:26 crc kubenswrapper[4841]: I1203 17:23:26.277610 4841 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dbbe92f9-c159-49ce-90ab-dd67ff712b36","Type":"ContainerDied","Data":"0d7c883e1b3b8f34216d4a5c019352b3feb265ddd573253d667fe57f3a33296f"} Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.484929 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t6g55"] Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.490580 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.509012 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6g55"] Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.652320 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbgrs\" (UniqueName: \"kubernetes.io/projected/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-kube-api-access-jbgrs\") pod \"redhat-operators-t6g55\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.652382 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-catalog-content\") pod \"redhat-operators-t6g55\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.652517 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-utilities\") pod \"redhat-operators-t6g55\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc 
kubenswrapper[4841]: I1203 17:23:30.754214 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-utilities\") pod \"redhat-operators-t6g55\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.754316 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbgrs\" (UniqueName: \"kubernetes.io/projected/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-kube-api-access-jbgrs\") pod \"redhat-operators-t6g55\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.754339 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-catalog-content\") pod \"redhat-operators-t6g55\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.754774 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-catalog-content\") pod \"redhat-operators-t6g55\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.754992 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-utilities\") pod \"redhat-operators-t6g55\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.777422 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbgrs\" (UniqueName: \"kubernetes.io/projected/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-kube-api-access-jbgrs\") pod \"redhat-operators-t6g55\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:30 crc kubenswrapper[4841]: I1203 17:23:30.844360 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:31 crc kubenswrapper[4841]: I1203 17:23:31.325419 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" event={"ID":"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2","Type":"ContainerStarted","Data":"6cd775f5a94cc90871ba0c4ec847ca78ccb49f5c5691ba21e153c19484543662"} Dec 03 17:23:31 crc kubenswrapper[4841]: I1203 17:23:31.333485 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t6g55"] Dec 03 17:23:31 crc kubenswrapper[4841]: I1203 17:23:31.336896 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"dbbe92f9-c159-49ce-90ab-dd67ff712b36","Type":"ContainerStarted","Data":"4e20056bd0499572777f8f8235f13b5d1eb00c490b8cb6ee7d89aa44d9559402"} Dec 03 17:23:31 crc kubenswrapper[4841]: I1203 17:23:31.337660 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:23:31 crc kubenswrapper[4841]: W1203 17:23:31.343089 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdb4b1a9_17bf_4d58_a70f_1640a2204efd.slice/crio-baf248ae9bda09f0c19ba2f5501658c1932f912c9f45b4df6fe2f7af2f6240fc WatchSource:0}: Error finding container baf248ae9bda09f0c19ba2f5501658c1932f912c9f45b4df6fe2f7af2f6240fc: Status 404 returned error can't find the container with id 
baf248ae9bda09f0c19ba2f5501658c1932f912c9f45b4df6fe2f7af2f6240fc Dec 03 17:23:31 crc kubenswrapper[4841]: I1203 17:23:31.343410 4841 generic.go:334] "Generic (PLEG): container finished" podID="b4eda350-169d-4b70-be32-13d2a1ab1aa3" containerID="626af222fad75c164e3661d763ac83ebd8c9835c59252094377fa8ccf58f5a49" exitCode=0 Dec 03 17:23:31 crc kubenswrapper[4841]: I1203 17:23:31.343447 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b4eda350-169d-4b70-be32-13d2a1ab1aa3","Type":"ContainerDied","Data":"626af222fad75c164e3661d763ac83ebd8c9835c59252094377fa8ccf58f5a49"} Dec 03 17:23:31 crc kubenswrapper[4841]: I1203 17:23:31.364720 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" podStartSLOduration=1.52577513 podStartE2EDuration="11.364701084s" podCreationTimestamp="2025-12-03 17:23:20 +0000 UTC" firstStartedPulling="2025-12-03 17:23:20.940898289 +0000 UTC m=+1395.328419036" lastFinishedPulling="2025-12-03 17:23:30.779824263 +0000 UTC m=+1405.167344990" observedRunningTime="2025-12-03 17:23:31.358059049 +0000 UTC m=+1405.745579776" watchObservedRunningTime="2025-12-03 17:23:31.364701084 +0000 UTC m=+1405.752221821" Dec 03 17:23:31 crc kubenswrapper[4841]: I1203 17:23:31.382811 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.382796233 podStartE2EDuration="41.382796233s" podCreationTimestamp="2025-12-03 17:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:23:31.382150257 +0000 UTC m=+1405.769670984" watchObservedRunningTime="2025-12-03 17:23:31.382796233 +0000 UTC m=+1405.770316960" Dec 03 17:23:32 crc kubenswrapper[4841]: I1203 17:23:32.354122 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerID="e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650" exitCode=0 Dec 03 17:23:32 crc kubenswrapper[4841]: I1203 17:23:32.354326 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6g55" event={"ID":"bdb4b1a9-17bf-4d58-a70f-1640a2204efd","Type":"ContainerDied","Data":"e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650"} Dec 03 17:23:32 crc kubenswrapper[4841]: I1203 17:23:32.354585 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6g55" event={"ID":"bdb4b1a9-17bf-4d58-a70f-1640a2204efd","Type":"ContainerStarted","Data":"baf248ae9bda09f0c19ba2f5501658c1932f912c9f45b4df6fe2f7af2f6240fc"} Dec 03 17:23:32 crc kubenswrapper[4841]: I1203 17:23:32.359284 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"b4eda350-169d-4b70-be32-13d2a1ab1aa3","Type":"ContainerStarted","Data":"a56d158dbbcd0e0efa127d779938383269e73b694e0a27344232d5a7f8e70beb"} Dec 03 17:23:32 crc kubenswrapper[4841]: I1203 17:23:32.359712 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 17:23:32 crc kubenswrapper[4841]: I1203 17:23:32.406969 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=45.406951 podStartE2EDuration="45.406951s" podCreationTimestamp="2025-12-03 17:22:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:23:32.400015658 +0000 UTC m=+1406.787536385" watchObservedRunningTime="2025-12-03 17:23:32.406951 +0000 UTC m=+1406.794471727" Dec 03 17:23:33 crc kubenswrapper[4841]: I1203 17:23:33.394664 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6g55" 
event={"ID":"bdb4b1a9-17bf-4d58-a70f-1640a2204efd","Type":"ContainerStarted","Data":"da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb"} Dec 03 17:23:36 crc kubenswrapper[4841]: I1203 17:23:36.423948 4841 generic.go:334] "Generic (PLEG): container finished" podID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerID="da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb" exitCode=0 Dec 03 17:23:36 crc kubenswrapper[4841]: I1203 17:23:36.424500 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6g55" event={"ID":"bdb4b1a9-17bf-4d58-a70f-1640a2204efd","Type":"ContainerDied","Data":"da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb"} Dec 03 17:23:39 crc kubenswrapper[4841]: I1203 17:23:39.451346 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6g55" event={"ID":"bdb4b1a9-17bf-4d58-a70f-1640a2204efd","Type":"ContainerStarted","Data":"46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10"} Dec 03 17:23:39 crc kubenswrapper[4841]: I1203 17:23:39.476305 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t6g55" podStartSLOduration=3.821949492 podStartE2EDuration="9.476290003s" podCreationTimestamp="2025-12-03 17:23:30 +0000 UTC" firstStartedPulling="2025-12-03 17:23:32.356371895 +0000 UTC m=+1406.743892622" lastFinishedPulling="2025-12-03 17:23:38.010712396 +0000 UTC m=+1412.398233133" observedRunningTime="2025-12-03 17:23:39.470361356 +0000 UTC m=+1413.857882083" watchObservedRunningTime="2025-12-03 17:23:39.476290003 +0000 UTC m=+1413.863810720" Dec 03 17:23:40 crc kubenswrapper[4841]: I1203 17:23:40.845103 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:40 crc kubenswrapper[4841]: I1203 17:23:40.846786 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:41 crc kubenswrapper[4841]: I1203 17:23:41.260715 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="dbbe92f9-c159-49ce-90ab-dd67ff712b36" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.213:5671: connect: connection refused" Dec 03 17:23:41 crc kubenswrapper[4841]: I1203 17:23:41.924018 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t6g55" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerName="registry-server" probeResult="failure" output=< Dec 03 17:23:41 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Dec 03 17:23:41 crc kubenswrapper[4841]: > Dec 03 17:23:43 crc kubenswrapper[4841]: I1203 17:23:43.489302 4841 generic.go:334] "Generic (PLEG): container finished" podID="c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2" containerID="6cd775f5a94cc90871ba0c4ec847ca78ccb49f5c5691ba21e153c19484543662" exitCode=0 Dec 03 17:23:43 crc kubenswrapper[4841]: I1203 17:23:43.489393 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" event={"ID":"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2","Type":"ContainerDied","Data":"6cd775f5a94cc90871ba0c4ec847ca78ccb49f5c5691ba21e153c19484543662"} Dec 03 17:23:44 crc kubenswrapper[4841]: I1203 17:23:44.956433 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.060216 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-ssh-key\") pod \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.060280 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-inventory\") pod \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.060331 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-repo-setup-combined-ca-bundle\") pod \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.060445 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2964\" (UniqueName: \"kubernetes.io/projected/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-kube-api-access-g2964\") pod \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\" (UID: \"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2\") " Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.067830 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-kube-api-access-g2964" (OuterVolumeSpecName: "kube-api-access-g2964") pod "c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2" (UID: "c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2"). InnerVolumeSpecName "kube-api-access-g2964". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.068193 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2" (UID: "c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.105038 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-inventory" (OuterVolumeSpecName: "inventory") pod "c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2" (UID: "c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.116116 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2" (UID: "c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.163574 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2964\" (UniqueName: \"kubernetes.io/projected/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-kube-api-access-g2964\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.163603 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.163659 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.163671 4841 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.520027 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" event={"ID":"c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2","Type":"ContainerDied","Data":"c304fe99ba0d2e70b3d819acfe4dc5b4c01c6722b1822a75b3c665819bc398ef"} Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.520076 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c304fe99ba0d2e70b3d819acfe4dc5b4c01c6722b1822a75b3c665819bc398ef" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.520121 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.636337 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g"] Dec 03 17:23:45 crc kubenswrapper[4841]: E1203 17:23:45.636994 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.637025 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.637291 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.638123 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.649271 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.649572 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.649716 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.649798 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.654351 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g"] Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.775702 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q7w8g\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.775888 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q7w8g\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.776038 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmxjf\" (UniqueName: \"kubernetes.io/projected/b310e506-2bc4-400e-acd0-749838969d1c-kube-api-access-xmxjf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q7w8g\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.878570 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmxjf\" (UniqueName: \"kubernetes.io/projected/b310e506-2bc4-400e-acd0-749838969d1c-kube-api-access-xmxjf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q7w8g\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.878640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q7w8g\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.878761 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q7w8g\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.883188 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q7w8g\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.895783 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q7w8g\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.905397 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmxjf\" (UniqueName: \"kubernetes.io/projected/b310e506-2bc4-400e-acd0-749838969d1c-kube-api-access-xmxjf\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-q7w8g\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:45 crc kubenswrapper[4841]: I1203 17:23:45.958548 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:46 crc kubenswrapper[4841]: I1203 17:23:46.514591 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g"] Dec 03 17:23:47 crc kubenswrapper[4841]: I1203 17:23:47.539251 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" event={"ID":"b310e506-2bc4-400e-acd0-749838969d1c","Type":"ContainerStarted","Data":"8771052ef634487c6e1b8bc574d5e247b79ea76dcd3fefe0c4a4b8a56577a864"} Dec 03 17:23:47 crc kubenswrapper[4841]: I1203 17:23:47.539568 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" event={"ID":"b310e506-2bc4-400e-acd0-749838969d1c","Type":"ContainerStarted","Data":"2345af9ecc450d4406595a095d105226910d887c1fd9f32e517768e097f960b8"} Dec 03 17:23:47 crc kubenswrapper[4841]: I1203 17:23:47.558140 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" podStartSLOduration=2.071289978 podStartE2EDuration="2.558120804s" podCreationTimestamp="2025-12-03 17:23:45 +0000 UTC" firstStartedPulling="2025-12-03 17:23:46.529987788 +0000 UTC m=+1420.917508515" lastFinishedPulling="2025-12-03 17:23:47.016818614 +0000 UTC m=+1421.404339341" observedRunningTime="2025-12-03 17:23:47.553012857 +0000 UTC m=+1421.940533584" watchObservedRunningTime="2025-12-03 17:23:47.558120804 +0000 UTC m=+1421.945641531" Dec 03 17:23:48 crc kubenswrapper[4841]: I1203 17:23:48.000189 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 17:23:50 crc kubenswrapper[4841]: I1203 17:23:50.572749 4841 generic.go:334] "Generic (PLEG): container finished" podID="b310e506-2bc4-400e-acd0-749838969d1c" 
containerID="8771052ef634487c6e1b8bc574d5e247b79ea76dcd3fefe0c4a4b8a56577a864" exitCode=0 Dec 03 17:23:50 crc kubenswrapper[4841]: I1203 17:23:50.572986 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" event={"ID":"b310e506-2bc4-400e-acd0-749838969d1c","Type":"ContainerDied","Data":"8771052ef634487c6e1b8bc574d5e247b79ea76dcd3fefe0c4a4b8a56577a864"} Dec 03 17:23:50 crc kubenswrapper[4841]: I1203 17:23:50.923461 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:51 crc kubenswrapper[4841]: I1203 17:23:51.006913 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:51 crc kubenswrapper[4841]: I1203 17:23:51.189870 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t6g55"] Dec 03 17:23:51 crc kubenswrapper[4841]: I1203 17:23:51.260209 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.189363 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.346238 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-inventory\") pod \"b310e506-2bc4-400e-acd0-749838969d1c\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.346453 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmxjf\" (UniqueName: \"kubernetes.io/projected/b310e506-2bc4-400e-acd0-749838969d1c-kube-api-access-xmxjf\") pod \"b310e506-2bc4-400e-acd0-749838969d1c\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.346537 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-ssh-key\") pod \"b310e506-2bc4-400e-acd0-749838969d1c\" (UID: \"b310e506-2bc4-400e-acd0-749838969d1c\") " Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.353502 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b310e506-2bc4-400e-acd0-749838969d1c-kube-api-access-xmxjf" (OuterVolumeSpecName: "kube-api-access-xmxjf") pod "b310e506-2bc4-400e-acd0-749838969d1c" (UID: "b310e506-2bc4-400e-acd0-749838969d1c"). InnerVolumeSpecName "kube-api-access-xmxjf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.380049 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-inventory" (OuterVolumeSpecName: "inventory") pod "b310e506-2bc4-400e-acd0-749838969d1c" (UID: "b310e506-2bc4-400e-acd0-749838969d1c"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.405899 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b310e506-2bc4-400e-acd0-749838969d1c" (UID: "b310e506-2bc4-400e-acd0-749838969d1c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.450318 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmxjf\" (UniqueName: \"kubernetes.io/projected/b310e506-2bc4-400e-acd0-749838969d1c-kube-api-access-xmxjf\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.450428 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.450515 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b310e506-2bc4-400e-acd0-749838969d1c-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.595370 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t6g55" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerName="registry-server" containerID="cri-o://46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10" gracePeriod=2 Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.595527 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" event={"ID":"b310e506-2bc4-400e-acd0-749838969d1c","Type":"ContainerDied","Data":"2345af9ecc450d4406595a095d105226910d887c1fd9f32e517768e097f960b8"} Dec 03 17:23:52 crc kubenswrapper[4841]: 
I1203 17:23:52.595582 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2345af9ecc450d4406595a095d105226910d887c1fd9f32e517768e097f960b8" Dec 03 17:23:52 crc kubenswrapper[4841]: I1203 17:23:52.595672 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-q7w8g" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.879625 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc"] Dec 03 17:23:53 crc kubenswrapper[4841]: E1203 17:23:52.880023 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b310e506-2bc4-400e-acd0-749838969d1c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.880035 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b310e506-2bc4-400e-acd0-749838969d1c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.880216 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b310e506-2bc4-400e-acd0-749838969d1c" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.881507 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.886618 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.886661 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.886630 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.886845 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.898286 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc"] Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:52.989134 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.067004 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.067079 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.067289 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsjk2\" (UniqueName: \"kubernetes.io/projected/15e8ed9f-b5ae-44bb-b295-1222cdad5513-kube-api-access-hsjk2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.067429 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.170041 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jbgrs\" (UniqueName: \"kubernetes.io/projected/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-kube-api-access-jbgrs\") pod \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.170209 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-utilities\") pod \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.170410 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-catalog-content\") pod \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\" (UID: \"bdb4b1a9-17bf-4d58-a70f-1640a2204efd\") " Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.171164 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.171244 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.171311 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsjk2\" (UniqueName: 
\"kubernetes.io/projected/15e8ed9f-b5ae-44bb-b295-1222cdad5513-kube-api-access-hsjk2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.171377 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.171871 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-utilities" (OuterVolumeSpecName: "utilities") pod "bdb4b1a9-17bf-4d58-a70f-1640a2204efd" (UID: "bdb4b1a9-17bf-4d58-a70f-1640a2204efd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.177269 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-kube-api-access-jbgrs" (OuterVolumeSpecName: "kube-api-access-jbgrs") pod "bdb4b1a9-17bf-4d58-a70f-1640a2204efd" (UID: "bdb4b1a9-17bf-4d58-a70f-1640a2204efd"). InnerVolumeSpecName "kube-api-access-jbgrs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.178724 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.177150 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.185967 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.192889 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsjk2\" (UniqueName: \"kubernetes.io/projected/15e8ed9f-b5ae-44bb-b295-1222cdad5513-kube-api-access-hsjk2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.212810 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.274243 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jbgrs\" (UniqueName: \"kubernetes.io/projected/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-kube-api-access-jbgrs\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.274272 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.303584 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bdb4b1a9-17bf-4d58-a70f-1640a2204efd" (UID: "bdb4b1a9-17bf-4d58-a70f-1640a2204efd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.377566 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bdb4b1a9-17bf-4d58-a70f-1640a2204efd-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.606519 4841 generic.go:334] "Generic (PLEG): container finished" podID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerID="46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10" exitCode=0 Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.606562 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6g55" event={"ID":"bdb4b1a9-17bf-4d58-a70f-1640a2204efd","Type":"ContainerDied","Data":"46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10"} Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.606590 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t6g55" event={"ID":"bdb4b1a9-17bf-4d58-a70f-1640a2204efd","Type":"ContainerDied","Data":"baf248ae9bda09f0c19ba2f5501658c1932f912c9f45b4df6fe2f7af2f6240fc"} Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.606607 4841 scope.go:117] "RemoveContainer" containerID="46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.606734 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t6g55" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.641549 4841 scope.go:117] "RemoveContainer" containerID="da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.647305 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t6g55"] Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.661371 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t6g55"] Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.680065 4841 scope.go:117] "RemoveContainer" containerID="e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.727472 4841 scope.go:117] "RemoveContainer" containerID="46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10" Dec 03 17:23:53 crc kubenswrapper[4841]: E1203 17:23:53.728021 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10\": container with ID starting with 46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10 not found: ID does not exist" containerID="46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.728052 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10"} err="failed to get container status \"46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10\": rpc error: code = NotFound desc = could not find container \"46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10\": container with ID starting with 46060382f97599e1230652ac9b76c7d219c399f118337053cea310b77e039d10 not found: ID does 
not exist" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.728072 4841 scope.go:117] "RemoveContainer" containerID="da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb" Dec 03 17:23:53 crc kubenswrapper[4841]: E1203 17:23:53.728640 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb\": container with ID starting with da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb not found: ID does not exist" containerID="da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.728663 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb"} err="failed to get container status \"da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb\": rpc error: code = NotFound desc = could not find container \"da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb\": container with ID starting with da47a5a6dd63b0ea465b3a51cca739a15735e19bd52f642c465aa49c5f7c36eb not found: ID does not exist" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.728676 4841 scope.go:117] "RemoveContainer" containerID="e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650" Dec 03 17:23:53 crc kubenswrapper[4841]: E1203 17:23:53.729803 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650\": container with ID starting with e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650 not found: ID does not exist" containerID="e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650" Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.729819 4841 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650"} err="failed to get container status \"e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650\": rpc error: code = NotFound desc = could not find container \"e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650\": container with ID starting with e3220c0434761198676f147aee0ecb3a9ef098be675b8ea3ae73234983e5d650 not found: ID does not exist" Dec 03 17:23:53 crc kubenswrapper[4841]: W1203 17:23:53.854284 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod15e8ed9f_b5ae_44bb_b295_1222cdad5513.slice/crio-3710dc4729287245a0d8fdf1470b65674f0bb1e3fa56c601bed602cb278a920d WatchSource:0}: Error finding container 3710dc4729287245a0d8fdf1470b65674f0bb1e3fa56c601bed602cb278a920d: Status 404 returned error can't find the container with id 3710dc4729287245a0d8fdf1470b65674f0bb1e3fa56c601bed602cb278a920d Dec 03 17:23:53 crc kubenswrapper[4841]: I1203 17:23:53.860436 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc"] Dec 03 17:23:54 crc kubenswrapper[4841]: I1203 17:23:54.262096 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" path="/var/lib/kubelet/pods/bdb4b1a9-17bf-4d58-a70f-1640a2204efd/volumes" Dec 03 17:23:54 crc kubenswrapper[4841]: I1203 17:23:54.617945 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" event={"ID":"15e8ed9f-b5ae-44bb-b295-1222cdad5513","Type":"ContainerStarted","Data":"3710dc4729287245a0d8fdf1470b65674f0bb1e3fa56c601bed602cb278a920d"} Dec 03 17:23:55 crc kubenswrapper[4841]: I1203 17:23:55.633552 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" 
event={"ID":"15e8ed9f-b5ae-44bb-b295-1222cdad5513","Type":"ContainerStarted","Data":"8854e3e0dda2044b21069122e0e14340545a0ee16d5bd4afa4e48349e73efd2a"} Dec 03 17:23:55 crc kubenswrapper[4841]: I1203 17:23:55.661519 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" podStartSLOduration=3.190501097 podStartE2EDuration="3.661495269s" podCreationTimestamp="2025-12-03 17:23:52 +0000 UTC" firstStartedPulling="2025-12-03 17:23:53.856593582 +0000 UTC m=+1428.244114309" lastFinishedPulling="2025-12-03 17:23:54.327587744 +0000 UTC m=+1428.715108481" observedRunningTime="2025-12-03 17:23:55.658352198 +0000 UTC m=+1430.045872995" watchObservedRunningTime="2025-12-03 17:23:55.661495269 +0000 UTC m=+1430.049016036" Dec 03 17:24:19 crc kubenswrapper[4841]: I1203 17:24:19.970013 4841 scope.go:117] "RemoveContainer" containerID="b093fdb21f484539e2af54030a2f74d5911234561939db70e5925af20e87f3ef" Dec 03 17:24:20 crc kubenswrapper[4841]: I1203 17:24:20.008204 4841 scope.go:117] "RemoveContainer" containerID="f1f3d439d70a1fafa84a9ab6f3659e96689557ef080da416ec8469461dc0733b" Dec 03 17:24:20 crc kubenswrapper[4841]: I1203 17:24:20.062562 4841 scope.go:117] "RemoveContainer" containerID="e648f4c2c0cd4cbfa576aed1bd3ce958793bcb6f8ef0a320a51436843de27c16" Dec 03 17:25:09 crc kubenswrapper[4841]: I1203 17:25:09.316847 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:25:09 crc kubenswrapper[4841]: I1203 17:25:09.317758 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:25:20 crc kubenswrapper[4841]: I1203 17:25:20.225164 4841 scope.go:117] "RemoveContainer" containerID="4124a709b9468d60618d0a00013f2e8302ae03033ce52df7b44443775fcb92f3" Dec 03 17:25:39 crc kubenswrapper[4841]: I1203 17:25:39.316507 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:25:39 crc kubenswrapper[4841]: I1203 17:25:39.317293 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:26:09 crc kubenswrapper[4841]: I1203 17:26:09.317296 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:26:09 crc kubenswrapper[4841]: I1203 17:26:09.318841 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:26:09 crc kubenswrapper[4841]: I1203 17:26:09.319018 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:26:09 crc 
kubenswrapper[4841]: I1203 17:26:09.320310 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:26:09 crc kubenswrapper[4841]: I1203 17:26:09.320433 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" gracePeriod=600 Dec 03 17:26:09 crc kubenswrapper[4841]: E1203 17:26:09.452864 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:26:10 crc kubenswrapper[4841]: I1203 17:26:10.354592 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" exitCode=0 Dec 03 17:26:10 crc kubenswrapper[4841]: I1203 17:26:10.354704 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686"} Dec 03 17:26:10 crc kubenswrapper[4841]: I1203 17:26:10.354986 4841 scope.go:117] "RemoveContainer" 
containerID="ba0a1c8798769f6bd460530e7ac2c690f029124a02f84950fbf6164264b3c8a5" Dec 03 17:26:10 crc kubenswrapper[4841]: I1203 17:26:10.355987 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:26:10 crc kubenswrapper[4841]: E1203 17:26:10.356523 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:26:22 crc kubenswrapper[4841]: I1203 17:26:22.239656 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:26:22 crc kubenswrapper[4841]: E1203 17:26:22.240721 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:26:35 crc kubenswrapper[4841]: I1203 17:26:35.239544 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:26:35 crc kubenswrapper[4841]: E1203 17:26:35.240779 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.390531 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g665c"] Dec 03 17:26:47 crc kubenswrapper[4841]: E1203 17:26:47.391674 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerName="registry-server" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.391690 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerName="registry-server" Dec 03 17:26:47 crc kubenswrapper[4841]: E1203 17:26:47.391703 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerName="extract-content" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.391711 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerName="extract-content" Dec 03 17:26:47 crc kubenswrapper[4841]: E1203 17:26:47.391728 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerName="extract-utilities" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.391735 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerName="extract-utilities" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.391977 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdb4b1a9-17bf-4d58-a70f-1640a2204efd" containerName="registry-server" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.409051 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.423745 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g665c"] Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.478712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-catalog-content\") pod \"certified-operators-g665c\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.478815 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cczm\" (UniqueName: \"kubernetes.io/projected/87dfbcff-7e9a-44bd-adb3-8079886da7aa-kube-api-access-9cczm\") pod \"certified-operators-g665c\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.478842 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-utilities\") pod \"certified-operators-g665c\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.580096 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cczm\" (UniqueName: \"kubernetes.io/projected/87dfbcff-7e9a-44bd-adb3-8079886da7aa-kube-api-access-9cczm\") pod \"certified-operators-g665c\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.580154 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-utilities\") pod \"certified-operators-g665c\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.580258 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-catalog-content\") pod \"certified-operators-g665c\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.580712 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-utilities\") pod \"certified-operators-g665c\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.580729 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-catalog-content\") pod \"certified-operators-g665c\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.610465 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cczm\" (UniqueName: \"kubernetes.io/projected/87dfbcff-7e9a-44bd-adb3-8079886da7aa-kube-api-access-9cczm\") pod \"certified-operators-g665c\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:47 crc kubenswrapper[4841]: I1203 17:26:47.736620 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:48 crc kubenswrapper[4841]: I1203 17:26:48.267209 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g665c"] Dec 03 17:26:48 crc kubenswrapper[4841]: I1203 17:26:48.789306 4841 generic.go:334] "Generic (PLEG): container finished" podID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerID="7b7761801f93ef99905813509bf38d0b63d42a9df0f77ed8b125275b15bfcaa2" exitCode=0 Dec 03 17:26:48 crc kubenswrapper[4841]: I1203 17:26:48.789395 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g665c" event={"ID":"87dfbcff-7e9a-44bd-adb3-8079886da7aa","Type":"ContainerDied","Data":"7b7761801f93ef99905813509bf38d0b63d42a9df0f77ed8b125275b15bfcaa2"} Dec 03 17:26:48 crc kubenswrapper[4841]: I1203 17:26:48.789440 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g665c" event={"ID":"87dfbcff-7e9a-44bd-adb3-8079886da7aa","Type":"ContainerStarted","Data":"7ab444bf4264f35d8c2548128163834820f423c69b6cf265355db31bd5315798"} Dec 03 17:26:48 crc kubenswrapper[4841]: I1203 17:26:48.792624 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:26:50 crc kubenswrapper[4841]: I1203 17:26:50.239812 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:26:50 crc kubenswrapper[4841]: E1203 17:26:50.240563 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 
17:26:50 crc kubenswrapper[4841]: I1203 17:26:50.811808 4841 generic.go:334] "Generic (PLEG): container finished" podID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerID="344efdcabe0c8c51c887c8f1b668564c7ac2bc0604e9d4e6e57607e7c78c9491" exitCode=0 Dec 03 17:26:50 crc kubenswrapper[4841]: I1203 17:26:50.811867 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g665c" event={"ID":"87dfbcff-7e9a-44bd-adb3-8079886da7aa","Type":"ContainerDied","Data":"344efdcabe0c8c51c887c8f1b668564c7ac2bc0604e9d4e6e57607e7c78c9491"} Dec 03 17:26:51 crc kubenswrapper[4841]: I1203 17:26:51.827227 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g665c" event={"ID":"87dfbcff-7e9a-44bd-adb3-8079886da7aa","Type":"ContainerStarted","Data":"c532a6e64e3e3efaf531d1e1c3c551a5162abffe07d561ff6f5a45e392297db3"} Dec 03 17:26:51 crc kubenswrapper[4841]: I1203 17:26:51.863539 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g665c" podStartSLOduration=2.238534115 podStartE2EDuration="4.86351399s" podCreationTimestamp="2025-12-03 17:26:47 +0000 UTC" firstStartedPulling="2025-12-03 17:26:48.792250174 +0000 UTC m=+1603.179770941" lastFinishedPulling="2025-12-03 17:26:51.417230079 +0000 UTC m=+1605.804750816" observedRunningTime="2025-12-03 17:26:51.850592824 +0000 UTC m=+1606.238113571" watchObservedRunningTime="2025-12-03 17:26:51.86351399 +0000 UTC m=+1606.251034737" Dec 03 17:26:57 crc kubenswrapper[4841]: I1203 17:26:57.737334 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:57 crc kubenswrapper[4841]: I1203 17:26:57.738702 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:57 crc kubenswrapper[4841]: I1203 17:26:57.789180 4841 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:57 crc kubenswrapper[4841]: I1203 17:26:57.979176 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:26:58 crc kubenswrapper[4841]: I1203 17:26:58.037144 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g665c"] Dec 03 17:26:59 crc kubenswrapper[4841]: I1203 17:26:59.915482 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g665c" podUID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerName="registry-server" containerID="cri-o://c532a6e64e3e3efaf531d1e1c3c551a5162abffe07d561ff6f5a45e392297db3" gracePeriod=2 Dec 03 17:27:00 crc kubenswrapper[4841]: I1203 17:27:00.938609 4841 generic.go:334] "Generic (PLEG): container finished" podID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerID="c532a6e64e3e3efaf531d1e1c3c551a5162abffe07d561ff6f5a45e392297db3" exitCode=0 Dec 03 17:27:00 crc kubenswrapper[4841]: I1203 17:27:00.938716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g665c" event={"ID":"87dfbcff-7e9a-44bd-adb3-8079886da7aa","Type":"ContainerDied","Data":"c532a6e64e3e3efaf531d1e1c3c551a5162abffe07d561ff6f5a45e392297db3"} Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.049835 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.119989 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-utilities\") pod \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.120360 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cczm\" (UniqueName: \"kubernetes.io/projected/87dfbcff-7e9a-44bd-adb3-8079886da7aa-kube-api-access-9cczm\") pod \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.120441 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-catalog-content\") pod \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\" (UID: \"87dfbcff-7e9a-44bd-adb3-8079886da7aa\") " Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.123671 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-utilities" (OuterVolumeSpecName: "utilities") pod "87dfbcff-7e9a-44bd-adb3-8079886da7aa" (UID: "87dfbcff-7e9a-44bd-adb3-8079886da7aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.145204 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87dfbcff-7e9a-44bd-adb3-8079886da7aa-kube-api-access-9cczm" (OuterVolumeSpecName: "kube-api-access-9cczm") pod "87dfbcff-7e9a-44bd-adb3-8079886da7aa" (UID: "87dfbcff-7e9a-44bd-adb3-8079886da7aa"). InnerVolumeSpecName "kube-api-access-9cczm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.223625 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.223665 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cczm\" (UniqueName: \"kubernetes.io/projected/87dfbcff-7e9a-44bd-adb3-8079886da7aa-kube-api-access-9cczm\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.631880 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "87dfbcff-7e9a-44bd-adb3-8079886da7aa" (UID: "87dfbcff-7e9a-44bd-adb3-8079886da7aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.732712 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/87dfbcff-7e9a-44bd-adb3-8079886da7aa-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.953178 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g665c" event={"ID":"87dfbcff-7e9a-44bd-adb3-8079886da7aa","Type":"ContainerDied","Data":"7ab444bf4264f35d8c2548128163834820f423c69b6cf265355db31bd5315798"} Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.953247 4841 scope.go:117] "RemoveContainer" containerID="c532a6e64e3e3efaf531d1e1c3c551a5162abffe07d561ff6f5a45e392297db3" Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.953288 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g665c" Dec 03 17:27:01 crc kubenswrapper[4841]: I1203 17:27:01.999326 4841 scope.go:117] "RemoveContainer" containerID="344efdcabe0c8c51c887c8f1b668564c7ac2bc0604e9d4e6e57607e7c78c9491" Dec 03 17:27:02 crc kubenswrapper[4841]: I1203 17:27:02.002030 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g665c"] Dec 03 17:27:02 crc kubenswrapper[4841]: I1203 17:27:02.015674 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g665c"] Dec 03 17:27:02 crc kubenswrapper[4841]: I1203 17:27:02.033507 4841 scope.go:117] "RemoveContainer" containerID="7b7761801f93ef99905813509bf38d0b63d42a9df0f77ed8b125275b15bfcaa2" Dec 03 17:27:02 crc kubenswrapper[4841]: I1203 17:27:02.251566 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" path="/var/lib/kubelet/pods/87dfbcff-7e9a-44bd-adb3-8079886da7aa/volumes" Dec 03 17:27:04 crc kubenswrapper[4841]: I1203 17:27:04.239297 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:27:04 crc kubenswrapper[4841]: E1203 17:27:04.240182 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:27:12 crc kubenswrapper[4841]: I1203 17:27:12.103259 4841 generic.go:334] "Generic (PLEG): container finished" podID="15e8ed9f-b5ae-44bb-b295-1222cdad5513" containerID="8854e3e0dda2044b21069122e0e14340545a0ee16d5bd4afa4e48349e73efd2a" exitCode=0 Dec 03 17:27:12 crc 
kubenswrapper[4841]: I1203 17:27:12.103357 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" event={"ID":"15e8ed9f-b5ae-44bb-b295-1222cdad5513","Type":"ContainerDied","Data":"8854e3e0dda2044b21069122e0e14340545a0ee16d5bd4afa4e48349e73efd2a"} Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.575815 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.681303 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsjk2\" (UniqueName: \"kubernetes.io/projected/15e8ed9f-b5ae-44bb-b295-1222cdad5513-kube-api-access-hsjk2\") pod \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.681412 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-ssh-key\") pod \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.681481 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-bootstrap-combined-ca-bundle\") pod \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.681526 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-inventory\") pod \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\" (UID: \"15e8ed9f-b5ae-44bb-b295-1222cdad5513\") " Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 
17:27:13.686589 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "15e8ed9f-b5ae-44bb-b295-1222cdad5513" (UID: "15e8ed9f-b5ae-44bb-b295-1222cdad5513"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.688832 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15e8ed9f-b5ae-44bb-b295-1222cdad5513-kube-api-access-hsjk2" (OuterVolumeSpecName: "kube-api-access-hsjk2") pod "15e8ed9f-b5ae-44bb-b295-1222cdad5513" (UID: "15e8ed9f-b5ae-44bb-b295-1222cdad5513"). InnerVolumeSpecName "kube-api-access-hsjk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.710878 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-inventory" (OuterVolumeSpecName: "inventory") pod "15e8ed9f-b5ae-44bb-b295-1222cdad5513" (UID: "15e8ed9f-b5ae-44bb-b295-1222cdad5513"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.728455 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "15e8ed9f-b5ae-44bb-b295-1222cdad5513" (UID: "15e8ed9f-b5ae-44bb-b295-1222cdad5513"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.783218 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsjk2\" (UniqueName: \"kubernetes.io/projected/15e8ed9f-b5ae-44bb-b295-1222cdad5513-kube-api-access-hsjk2\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.783248 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.783257 4841 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:13 crc kubenswrapper[4841]: I1203 17:27:13.783267 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/15e8ed9f-b5ae-44bb-b295-1222cdad5513-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.133697 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" event={"ID":"15e8ed9f-b5ae-44bb-b295-1222cdad5513","Type":"ContainerDied","Data":"3710dc4729287245a0d8fdf1470b65674f0bb1e3fa56c601bed602cb278a920d"} Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.133750 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3710dc4729287245a0d8fdf1470b65674f0bb1e3fa56c601bed602cb278a920d" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.133849 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.279173 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z"] Dec 03 17:27:14 crc kubenswrapper[4841]: E1203 17:27:14.279857 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerName="extract-utilities" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.279987 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerName="extract-utilities" Dec 03 17:27:14 crc kubenswrapper[4841]: E1203 17:27:14.280059 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerName="extract-content" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.280111 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerName="extract-content" Dec 03 17:27:14 crc kubenswrapper[4841]: E1203 17:27:14.280175 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerName="registry-server" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.280226 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerName="registry-server" Dec 03 17:27:14 crc kubenswrapper[4841]: E1203 17:27:14.280291 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15e8ed9f-b5ae-44bb-b295-1222cdad5513" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.280348 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="15e8ed9f-b5ae-44bb-b295-1222cdad5513" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.280625 
4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="87dfbcff-7e9a-44bd-adb3-8079886da7aa" containerName="registry-server" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.280698 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="15e8ed9f-b5ae-44bb-b295-1222cdad5513" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.281577 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.284871 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.285201 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.285649 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.286268 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.301870 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z"] Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.310661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sx49z\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 
17:27:14.310762 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sx49z\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.310820 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vc4c\" (UniqueName: \"kubernetes.io/projected/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-kube-api-access-6vc4c\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sx49z\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.413261 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sx49z\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.413755 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vc4c\" (UniqueName: \"kubernetes.io/projected/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-kube-api-access-6vc4c\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sx49z\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.414194 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sx49z\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.423419 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sx49z\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.425227 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sx49z\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.440549 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vc4c\" (UniqueName: \"kubernetes.io/projected/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-kube-api-access-6vc4c\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-sx49z\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:14 crc kubenswrapper[4841]: I1203 17:27:14.607210 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:27:15 crc kubenswrapper[4841]: I1203 17:27:15.162680 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z"] Dec 03 17:27:16 crc kubenswrapper[4841]: I1203 17:27:16.152153 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" event={"ID":"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73","Type":"ContainerStarted","Data":"48df4b5b50b25eee51825c2ee068acfb0ddd6609c07bfb8eb66ae34ffa25e191"} Dec 03 17:27:16 crc kubenswrapper[4841]: I1203 17:27:16.153502 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" event={"ID":"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73","Type":"ContainerStarted","Data":"081de971c39b8839fc634d7b61cf9abb62e23e1699d5d6bd66e23a31ea967990"} Dec 03 17:27:16 crc kubenswrapper[4841]: I1203 17:27:16.180264 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" podStartSLOduration=1.7154506280000001 podStartE2EDuration="2.180232879s" podCreationTimestamp="2025-12-03 17:27:14 +0000 UTC" firstStartedPulling="2025-12-03 17:27:15.165535418 +0000 UTC m=+1629.553056145" lastFinishedPulling="2025-12-03 17:27:15.630317669 +0000 UTC m=+1630.017838396" observedRunningTime="2025-12-03 17:27:16.179187543 +0000 UTC m=+1630.566708280" watchObservedRunningTime="2025-12-03 17:27:16.180232879 +0000 UTC m=+1630.567753646" Dec 03 17:27:19 crc kubenswrapper[4841]: I1203 17:27:19.239582 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:27:19 crc kubenswrapper[4841]: E1203 17:27:19.240139 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.131487 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w6vj8"] Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.135803 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.149049 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6vj8"] Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.258528 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-catalog-content\") pod \"redhat-marketplace-w6vj8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.258816 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88bsj\" (UniqueName: \"kubernetes.io/projected/c130084e-a303-499f-82b4-881a66c923f8-kube-api-access-88bsj\") pod \"redhat-marketplace-w6vj8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.259703 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-utilities\") pod \"redhat-marketplace-w6vj8\" (UID: 
\"c130084e-a303-499f-82b4-881a66c923f8\") " pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.361192 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-catalog-content\") pod \"redhat-marketplace-w6vj8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.361295 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88bsj\" (UniqueName: \"kubernetes.io/projected/c130084e-a303-499f-82b4-881a66c923f8-kube-api-access-88bsj\") pod \"redhat-marketplace-w6vj8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.361363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-utilities\") pod \"redhat-marketplace-w6vj8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.361777 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-catalog-content\") pod \"redhat-marketplace-w6vj8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.362299 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-utilities\") pod \"redhat-marketplace-w6vj8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " 
pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.387068 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88bsj\" (UniqueName: \"kubernetes.io/projected/c130084e-a303-499f-82b4-881a66c923f8-kube-api-access-88bsj\") pod \"redhat-marketplace-w6vj8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.464079 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:25 crc kubenswrapper[4841]: I1203 17:27:25.966578 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6vj8"] Dec 03 17:27:25 crc kubenswrapper[4841]: W1203 17:27:25.968219 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc130084e_a303_499f_82b4_881a66c923f8.slice/crio-11dffc6c33188af1e375c74a1244da3c946d032fbd6f02f9a6f393bf7cc311d8 WatchSource:0}: Error finding container 11dffc6c33188af1e375c74a1244da3c946d032fbd6f02f9a6f393bf7cc311d8: Status 404 returned error can't find the container with id 11dffc6c33188af1e375c74a1244da3c946d032fbd6f02f9a6f393bf7cc311d8 Dec 03 17:27:26 crc kubenswrapper[4841]: I1203 17:27:26.273567 4841 generic.go:334] "Generic (PLEG): container finished" podID="c130084e-a303-499f-82b4-881a66c923f8" containerID="550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705" exitCode=0 Dec 03 17:27:26 crc kubenswrapper[4841]: I1203 17:27:26.273714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6vj8" event={"ID":"c130084e-a303-499f-82b4-881a66c923f8","Type":"ContainerDied","Data":"550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705"} Dec 03 17:27:26 crc kubenswrapper[4841]: I1203 17:27:26.274898 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6vj8" event={"ID":"c130084e-a303-499f-82b4-881a66c923f8","Type":"ContainerStarted","Data":"11dffc6c33188af1e375c74a1244da3c946d032fbd6f02f9a6f393bf7cc311d8"} Dec 03 17:27:28 crc kubenswrapper[4841]: I1203 17:27:28.305176 4841 generic.go:334] "Generic (PLEG): container finished" podID="c130084e-a303-499f-82b4-881a66c923f8" containerID="ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3" exitCode=0 Dec 03 17:27:28 crc kubenswrapper[4841]: I1203 17:27:28.305212 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6vj8" event={"ID":"c130084e-a303-499f-82b4-881a66c923f8","Type":"ContainerDied","Data":"ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3"} Dec 03 17:27:30 crc kubenswrapper[4841]: I1203 17:27:30.330839 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6vj8" event={"ID":"c130084e-a303-499f-82b4-881a66c923f8","Type":"ContainerStarted","Data":"a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980"} Dec 03 17:27:30 crc kubenswrapper[4841]: I1203 17:27:30.356275 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w6vj8" podStartSLOduration=2.091484439 podStartE2EDuration="5.356255268s" podCreationTimestamp="2025-12-03 17:27:25 +0000 UTC" firstStartedPulling="2025-12-03 17:27:26.278177723 +0000 UTC m=+1640.665698450" lastFinishedPulling="2025-12-03 17:27:29.542948512 +0000 UTC m=+1643.930469279" observedRunningTime="2025-12-03 17:27:30.350363724 +0000 UTC m=+1644.737884461" watchObservedRunningTime="2025-12-03 17:27:30.356255268 +0000 UTC m=+1644.743776005" Dec 03 17:27:33 crc kubenswrapper[4841]: I1203 17:27:33.239939 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:27:33 crc 
kubenswrapper[4841]: E1203 17:27:33.240745 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:27:35 crc kubenswrapper[4841]: I1203 17:27:35.464537 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:35 crc kubenswrapper[4841]: I1203 17:27:35.464618 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:35 crc kubenswrapper[4841]: I1203 17:27:35.560632 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:36 crc kubenswrapper[4841]: I1203 17:27:36.480057 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:36 crc kubenswrapper[4841]: I1203 17:27:36.536347 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6vj8"] Dec 03 17:27:38 crc kubenswrapper[4841]: I1203 17:27:38.440695 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w6vj8" podUID="c130084e-a303-499f-82b4-881a66c923f8" containerName="registry-server" containerID="cri-o://a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980" gracePeriod=2 Dec 03 17:27:38 crc kubenswrapper[4841]: I1203 17:27:38.912329 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.073871 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-utilities\") pod \"c130084e-a303-499f-82b4-881a66c923f8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.074062 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88bsj\" (UniqueName: \"kubernetes.io/projected/c130084e-a303-499f-82b4-881a66c923f8-kube-api-access-88bsj\") pod \"c130084e-a303-499f-82b4-881a66c923f8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.074248 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-catalog-content\") pod \"c130084e-a303-499f-82b4-881a66c923f8\" (UID: \"c130084e-a303-499f-82b4-881a66c923f8\") " Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.075162 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-utilities" (OuterVolumeSpecName: "utilities") pod "c130084e-a303-499f-82b4-881a66c923f8" (UID: "c130084e-a303-499f-82b4-881a66c923f8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.082197 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c130084e-a303-499f-82b4-881a66c923f8-kube-api-access-88bsj" (OuterVolumeSpecName: "kube-api-access-88bsj") pod "c130084e-a303-499f-82b4-881a66c923f8" (UID: "c130084e-a303-499f-82b4-881a66c923f8"). InnerVolumeSpecName "kube-api-access-88bsj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.082506 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.108276 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c130084e-a303-499f-82b4-881a66c923f8" (UID: "c130084e-a303-499f-82b4-881a66c923f8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.184786 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88bsj\" (UniqueName: \"kubernetes.io/projected/c130084e-a303-499f-82b4-881a66c923f8-kube-api-access-88bsj\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.184865 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c130084e-a303-499f-82b4-881a66c923f8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.461782 4841 generic.go:334] "Generic (PLEG): container finished" podID="c130084e-a303-499f-82b4-881a66c923f8" containerID="a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980" exitCode=0 Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.461860 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6vj8" event={"ID":"c130084e-a303-499f-82b4-881a66c923f8","Type":"ContainerDied","Data":"a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980"} Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.461897 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w6vj8" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.461954 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w6vj8" event={"ID":"c130084e-a303-499f-82b4-881a66c923f8","Type":"ContainerDied","Data":"11dffc6c33188af1e375c74a1244da3c946d032fbd6f02f9a6f393bf7cc311d8"} Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.461998 4841 scope.go:117] "RemoveContainer" containerID="a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.503350 4841 scope.go:117] "RemoveContainer" containerID="ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.530790 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6vj8"] Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.548854 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w6vj8"] Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.554056 4841 scope.go:117] "RemoveContainer" containerID="550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.601456 4841 scope.go:117] "RemoveContainer" containerID="a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980" Dec 03 17:27:39 crc kubenswrapper[4841]: E1203 17:27:39.602568 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980\": container with ID starting with a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980 not found: ID does not exist" containerID="a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.602691 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980"} err="failed to get container status \"a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980\": rpc error: code = NotFound desc = could not find container \"a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980\": container with ID starting with a583731f7d3292f129fed04ff43dfb01adddd0c9000f6abae3ad629126c42980 not found: ID does not exist" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.602788 4841 scope.go:117] "RemoveContainer" containerID="ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3" Dec 03 17:27:39 crc kubenswrapper[4841]: E1203 17:27:39.603320 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3\": container with ID starting with ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3 not found: ID does not exist" containerID="ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.603383 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3"} err="failed to get container status \"ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3\": rpc error: code = NotFound desc = could not find container \"ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3\": container with ID starting with ae6552644f91a0c9269911b107157b118c4568e69957f08d8b275b4ec66d26f3 not found: ID does not exist" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.603427 4841 scope.go:117] "RemoveContainer" containerID="550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705" Dec 03 17:27:39 crc kubenswrapper[4841]: E1203 
17:27:39.603832 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705\": container with ID starting with 550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705 not found: ID does not exist" containerID="550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705" Dec 03 17:27:39 crc kubenswrapper[4841]: I1203 17:27:39.603943 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705"} err="failed to get container status \"550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705\": rpc error: code = NotFound desc = could not find container \"550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705\": container with ID starting with 550736a01333f67117d1ba2875f110daac53535932b8bfe3b8ab002f625f3705 not found: ID does not exist" Dec 03 17:27:40 crc kubenswrapper[4841]: I1203 17:27:40.259003 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c130084e-a303-499f-82b4-881a66c923f8" path="/var/lib/kubelet/pods/c130084e-a303-499f-82b4-881a66c923f8/volumes" Dec 03 17:27:46 crc kubenswrapper[4841]: I1203 17:27:46.251485 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:27:46 crc kubenswrapper[4841]: E1203 17:27:46.252609 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:28:01 crc kubenswrapper[4841]: I1203 17:28:01.239493 
4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:28:01 crc kubenswrapper[4841]: E1203 17:28:01.240444 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:28:15 crc kubenswrapper[4841]: I1203 17:28:15.239273 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:28:15 crc kubenswrapper[4841]: E1203 17:28:15.240392 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:28:17 crc kubenswrapper[4841]: I1203 17:28:17.050636 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-j5km2"] Dec 03 17:28:17 crc kubenswrapper[4841]: I1203 17:28:17.066504 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-7phr6"] Dec 03 17:28:17 crc kubenswrapper[4841]: I1203 17:28:17.077729 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-j5km2"] Dec 03 17:28:17 crc kubenswrapper[4841]: I1203 17:28:17.089125 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-7phr6"] Dec 03 17:28:18 crc kubenswrapper[4841]: I1203 17:28:18.051669 4841 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ec5e-account-create-update-bl669"] Dec 03 17:28:18 crc kubenswrapper[4841]: I1203 17:28:18.070086 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ec5e-account-create-update-bl669"] Dec 03 17:28:18 crc kubenswrapper[4841]: I1203 17:28:18.087171 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-961c-account-create-update-t8l55"] Dec 03 17:28:18 crc kubenswrapper[4841]: I1203 17:28:18.098033 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-961c-account-create-update-t8l55"] Dec 03 17:28:18 crc kubenswrapper[4841]: I1203 17:28:18.252877 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0bf21be-1f9f-4a32-915f-9b8503211879" path="/var/lib/kubelet/pods/a0bf21be-1f9f-4a32-915f-9b8503211879/volumes" Dec 03 17:28:18 crc kubenswrapper[4841]: I1203 17:28:18.253727 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1440ad0-f18c-4f97-8001-3a4aaf316279" path="/var/lib/kubelet/pods/b1440ad0-f18c-4f97-8001-3a4aaf316279/volumes" Dec 03 17:28:18 crc kubenswrapper[4841]: I1203 17:28:18.256961 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f462ad2e-2c46-4083-8a4f-016cdadc719c" path="/var/lib/kubelet/pods/f462ad2e-2c46-4083-8a4f-016cdadc719c/volumes" Dec 03 17:28:18 crc kubenswrapper[4841]: I1203 17:28:18.257775 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb0a89b5-7071-4060-a386-ccf821af25ec" path="/var/lib/kubelet/pods/fb0a89b5-7071-4060-a386-ccf821af25ec/volumes" Dec 03 17:28:19 crc kubenswrapper[4841]: I1203 17:28:19.045533 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-nclcf"] Dec 03 17:28:19 crc kubenswrapper[4841]: I1203 17:28:19.063270 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-fa9f-account-create-update-fptp5"] Dec 03 17:28:19 crc 
kubenswrapper[4841]: I1203 17:28:19.073987 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-nclcf"] Dec 03 17:28:19 crc kubenswrapper[4841]: I1203 17:28:19.084560 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-fa9f-account-create-update-fptp5"] Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.257696 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="931a0b37-5c50-415b-ba38-71e3d2c7e632" path="/var/lib/kubelet/pods/931a0b37-5c50-415b-ba38-71e3d2c7e632/volumes" Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.259720 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9121a48-14ed-4adc-8373-736c53b56e5b" path="/var/lib/kubelet/pods/a9121a48-14ed-4adc-8373-736c53b56e5b/volumes" Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.425266 4841 scope.go:117] "RemoveContainer" containerID="edc6416659e53ba054e285f97f8cd4e0c3d23a1dc3157d2fcafb981d1241604d" Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.460876 4841 scope.go:117] "RemoveContainer" containerID="8e91709042f4cd4d5e532608c308c2d5efdcce4cf621d22b7cb9965f57bb17bb" Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.495533 4841 scope.go:117] "RemoveContainer" containerID="7165555f1237e0b598ec78e1f1bed7f727bffe28edcb072a41dbe91a401b3877" Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.559344 4841 scope.go:117] "RemoveContainer" containerID="d2c934f166963e2218ba250d9fa9ca3b12cf8b7988efe72534d9a762133ac232" Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.601428 4841 scope.go:117] "RemoveContainer" containerID="b2b40bfcca0e09f6a264d8ec9b8d1f4c487ba795d36cab64d42af07b4e29be10" Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.642822 4841 scope.go:117] "RemoveContainer" containerID="766e6cef81d2558032d38deae49e281c5935af3c5b93907b90f0703155b28582" Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.683246 4841 scope.go:117] "RemoveContainer" 
containerID="3525fdfda01e10f6ba829f873aecbe0f85518296cf0fa36401eb2ce5c991c5f9" Dec 03 17:28:20 crc kubenswrapper[4841]: I1203 17:28:20.736952 4841 scope.go:117] "RemoveContainer" containerID="88df786e863cbef3c18e98ce4e80c65ca7c6eacd28d6df949f8ac5a47e7cbb5d" Dec 03 17:28:29 crc kubenswrapper[4841]: I1203 17:28:29.240225 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:28:29 crc kubenswrapper[4841]: E1203 17:28:29.241387 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.067951 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-c5l22"] Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.081007 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d8ff-account-create-update-zsbgm"] Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.089790 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-c5l22"] Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.098995 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d8ff-account-create-update-zsbgm"] Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.106747 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-576f-account-create-update-q8hwq"] Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.114779 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-78vbd"] Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.122393 4841 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-78vbd"] Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.129711 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-576f-account-create-update-q8hwq"] Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.136731 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-51cb-account-create-update-8zh96"] Dec 03 17:28:41 crc kubenswrapper[4841]: I1203 17:28:41.143925 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-51cb-account-create-update-8zh96"] Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.070552 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-2fp7p"] Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.088019 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-5830-account-create-update-2sv68"] Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.111029 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-sl79f"] Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.124126 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-2fp7p"] Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.136190 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-sl79f"] Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.146773 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-5830-account-create-update-2sv68"] Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.254135 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cd21556-0acb-47d3-8bd2-da1a675ac155" path="/var/lib/kubelet/pods/0cd21556-0acb-47d3-8bd2-da1a675ac155/volumes" Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.256223 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="18a0cfe2-d206-4a46-b6ff-08a332049b44" path="/var/lib/kubelet/pods/18a0cfe2-d206-4a46-b6ff-08a332049b44/volumes" Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.257830 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc94d86-48ee-4deb-9bf2-7606c6de4515" path="/var/lib/kubelet/pods/1fc94d86-48ee-4deb-9bf2-7606c6de4515/volumes" Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.259366 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f5d52e5-cdf1-4148-8481-8e8fa5bda200" path="/var/lib/kubelet/pods/5f5d52e5-cdf1-4148-8481-8e8fa5bda200/volumes" Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.262427 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b308ac2-d80a-4b8e-9c3b-065c794f6a00" path="/var/lib/kubelet/pods/7b308ac2-d80a-4b8e-9c3b-065c794f6a00/volumes" Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.264215 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35" path="/var/lib/kubelet/pods/9fbe3cc3-b8b1-4d9d-a36b-5cd24cd3ff35/volumes" Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.265992 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2aa5656-d3cf-43de-af5a-a9ba522ede37" path="/var/lib/kubelet/pods/b2aa5656-d3cf-43de-af5a-a9ba522ede37/volumes" Dec 03 17:28:42 crc kubenswrapper[4841]: I1203 17:28:42.273244 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54c21eb-8621-4959-a9be-de2efd6d1bb0" path="/var/lib/kubelet/pods/c54c21eb-8621-4959-a9be-de2efd6d1bb0/volumes" Dec 03 17:28:44 crc kubenswrapper[4841]: I1203 17:28:44.238762 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:28:44 crc kubenswrapper[4841]: E1203 17:28:44.239328 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:28:51 crc kubenswrapper[4841]: I1203 17:28:51.045944 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-6mxtf"] Dec 03 17:28:51 crc kubenswrapper[4841]: I1203 17:28:51.063752 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-6mxtf"] Dec 03 17:28:52 crc kubenswrapper[4841]: I1203 17:28:52.049018 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-cpmwr"] Dec 03 17:28:52 crc kubenswrapper[4841]: I1203 17:28:52.071189 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-cpmwr"] Dec 03 17:28:52 crc kubenswrapper[4841]: I1203 17:28:52.256413 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="336897e4-2c50-4739-b719-db8fa6b2389d" path="/var/lib/kubelet/pods/336897e4-2c50-4739-b719-db8fa6b2389d/volumes" Dec 03 17:28:52 crc kubenswrapper[4841]: I1203 17:28:52.258189 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db235153-a06b-4f9b-9129-76d9e7d7b1e4" path="/var/lib/kubelet/pods/db235153-a06b-4f9b-9129-76d9e7d7b1e4/volumes" Dec 03 17:28:56 crc kubenswrapper[4841]: I1203 17:28:56.247174 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:28:56 crc kubenswrapper[4841]: E1203 17:28:56.248303 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:29:05 crc kubenswrapper[4841]: I1203 17:29:05.443978 4841 generic.go:334] "Generic (PLEG): container finished" podID="dfb9c7ef-19c9-4582-b13c-c399a2ef4e73" containerID="48df4b5b50b25eee51825c2ee068acfb0ddd6609c07bfb8eb66ae34ffa25e191" exitCode=0 Dec 03 17:29:05 crc kubenswrapper[4841]: I1203 17:29:05.444115 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" event={"ID":"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73","Type":"ContainerDied","Data":"48df4b5b50b25eee51825c2ee068acfb0ddd6609c07bfb8eb66ae34ffa25e191"} Dec 03 17:29:06 crc kubenswrapper[4841]: I1203 17:29:06.975719 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.140511 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-inventory\") pod \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.140936 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-ssh-key\") pod \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.141054 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vc4c\" (UniqueName: \"kubernetes.io/projected/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-kube-api-access-6vc4c\") pod \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\" (UID: \"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73\") " Dec 03 17:29:07 crc 
kubenswrapper[4841]: I1203 17:29:07.147411 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-kube-api-access-6vc4c" (OuterVolumeSpecName: "kube-api-access-6vc4c") pod "dfb9c7ef-19c9-4582-b13c-c399a2ef4e73" (UID: "dfb9c7ef-19c9-4582-b13c-c399a2ef4e73"). InnerVolumeSpecName "kube-api-access-6vc4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.177146 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "dfb9c7ef-19c9-4582-b13c-c399a2ef4e73" (UID: "dfb9c7ef-19c9-4582-b13c-c399a2ef4e73"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.177185 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-inventory" (OuterVolumeSpecName: "inventory") pod "dfb9c7ef-19c9-4582-b13c-c399a2ef4e73" (UID: "dfb9c7ef-19c9-4582-b13c-c399a2ef4e73"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.244006 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.244046 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.244057 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vc4c\" (UniqueName: \"kubernetes.io/projected/dfb9c7ef-19c9-4582-b13c-c399a2ef4e73-kube-api-access-6vc4c\") on node \"crc\" DevicePath \"\"" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.468552 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" event={"ID":"dfb9c7ef-19c9-4582-b13c-c399a2ef4e73","Type":"ContainerDied","Data":"081de971c39b8839fc634d7b61cf9abb62e23e1699d5d6bd66e23a31ea967990"} Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.468598 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="081de971c39b8839fc634d7b61cf9abb62e23e1699d5d6bd66e23a31ea967990" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.468655 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-sx49z" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.544169 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b"] Dec 03 17:29:07 crc kubenswrapper[4841]: E1203 17:29:07.544640 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfb9c7ef-19c9-4582-b13c-c399a2ef4e73" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.544661 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfb9c7ef-19c9-4582-b13c-c399a2ef4e73" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 17:29:07 crc kubenswrapper[4841]: E1203 17:29:07.544681 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c130084e-a303-499f-82b4-881a66c923f8" containerName="extract-utilities" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.544691 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c130084e-a303-499f-82b4-881a66c923f8" containerName="extract-utilities" Dec 03 17:29:07 crc kubenswrapper[4841]: E1203 17:29:07.544725 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c130084e-a303-499f-82b4-881a66c923f8" containerName="registry-server" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.544736 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c130084e-a303-499f-82b4-881a66c923f8" containerName="registry-server" Dec 03 17:29:07 crc kubenswrapper[4841]: E1203 17:29:07.544762 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c130084e-a303-499f-82b4-881a66c923f8" containerName="extract-content" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.544771 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c130084e-a303-499f-82b4-881a66c923f8" containerName="extract-content" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 
17:29:07.545023 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfb9c7ef-19c9-4582-b13c-c399a2ef4e73" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.545046 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c130084e-a303-499f-82b4-881a66c923f8" containerName="registry-server" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.545792 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.548360 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.548362 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.548744 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.548892 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.552093 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b"] Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.657118 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpb6z\" (UniqueName: \"kubernetes.io/projected/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-kube-api-access-lpb6z\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.657177 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.657212 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.758330 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpb6z\" (UniqueName: \"kubernetes.io/projected/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-kube-api-access-lpb6z\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.758382 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.758417 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.762187 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.762396 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.774027 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpb6z\" (UniqueName: \"kubernetes.io/projected/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-kube-api-access-lpb6z\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:07 crc kubenswrapper[4841]: I1203 17:29:07.875415 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" Dec 03 17:29:08 crc kubenswrapper[4841]: I1203 17:29:08.266217 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b"] Dec 03 17:29:08 crc kubenswrapper[4841]: I1203 17:29:08.481606 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" event={"ID":"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d","Type":"ContainerStarted","Data":"6689740a4e74177ebbcb136ca64b49b7fb7b8958e18a4ed23bd28aa472873366"} Dec 03 17:29:09 crc kubenswrapper[4841]: I1203 17:29:09.492494 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" event={"ID":"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d","Type":"ContainerStarted","Data":"08800eef312ec6eff6f6a9f0cbec1f01eaa3cb2041437b5472ef7c98553feda9"} Dec 03 17:29:09 crc kubenswrapper[4841]: I1203 17:29:09.511420 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" podStartSLOduration=2.089403371 podStartE2EDuration="2.511404099s" podCreationTimestamp="2025-12-03 17:29:07 +0000 UTC" firstStartedPulling="2025-12-03 17:29:08.261580809 +0000 UTC m=+1742.649101546" lastFinishedPulling="2025-12-03 17:29:08.683581547 +0000 UTC m=+1743.071102274" observedRunningTime="2025-12-03 17:29:09.506032548 +0000 UTC m=+1743.893553285" watchObservedRunningTime="2025-12-03 17:29:09.511404099 +0000 UTC m=+1743.898924826" Dec 03 17:29:11 crc kubenswrapper[4841]: I1203 17:29:11.238802 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:29:11 crc kubenswrapper[4841]: E1203 17:29:11.239397 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:29:20 crc kubenswrapper[4841]: I1203 17:29:20.049804 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-8dbcn"] Dec 03 17:29:20 crc kubenswrapper[4841]: I1203 17:29:20.061656 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-8dbcn"] Dec 03 17:29:20 crc kubenswrapper[4841]: I1203 17:29:20.251304 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6ce602c-aad4-4d9d-a924-a200b8d8658d" path="/var/lib/kubelet/pods/e6ce602c-aad4-4d9d-a924-a200b8d8658d/volumes" Dec 03 17:29:20 crc kubenswrapper[4841]: I1203 17:29:20.948530 4841 scope.go:117] "RemoveContainer" containerID="9de148a07dcb85b7edd57f5ca6a713f4e0997490566e3dab9624a50d096a551f" Dec 03 17:29:21 crc kubenswrapper[4841]: I1203 17:29:21.010493 4841 scope.go:117] "RemoveContainer" containerID="05436975c6cd7a79998e170cfe416d49688d2f7bf8c0e75e3a5abab4ff9239b8" Dec 03 17:29:21 crc kubenswrapper[4841]: I1203 17:29:21.080337 4841 scope.go:117] "RemoveContainer" containerID="a6924621b7a8a58ce8137d6ea9a099a6dfb362785e7fc3124c567f3822756d36" Dec 03 17:29:21 crc kubenswrapper[4841]: I1203 17:29:21.105147 4841 scope.go:117] "RemoveContainer" containerID="deb21a925338c7a6c78babba12d8feb024209e4c35d880ffd2ddfcacbfff810c" Dec 03 17:29:21 crc kubenswrapper[4841]: I1203 17:29:21.149364 4841 scope.go:117] "RemoveContainer" containerID="4881bbc5a13872a28cf19465ff6b462d195ad5e10d9c3c24f2fa0bb853686334" Dec 03 17:29:21 crc kubenswrapper[4841]: I1203 17:29:21.200729 4841 scope.go:117] "RemoveContainer" containerID="79832e253ef65b2abcf988bd7233b615a262a974cc3bd7deb32227f0165a0dbc" Dec 03 17:29:21 crc 
kubenswrapper[4841]: I1203 17:29:21.249343 4841 scope.go:117] "RemoveContainer" containerID="1bf8594f901a1a02da5e7767f236f191c820e2f1710a10a617f523c31dc52d84" Dec 03 17:29:21 crc kubenswrapper[4841]: I1203 17:29:21.279390 4841 scope.go:117] "RemoveContainer" containerID="b3059769bb5b094d93a59edee97b8c8e70d3b31d09cab6f378f9d3d21beee1c3" Dec 03 17:29:21 crc kubenswrapper[4841]: I1203 17:29:21.305055 4841 scope.go:117] "RemoveContainer" containerID="a70c177d1c91b8ef1be80f0994ac50afea348a143d80a2dc27df8c4536413465" Dec 03 17:29:21 crc kubenswrapper[4841]: I1203 17:29:21.333556 4841 scope.go:117] "RemoveContainer" containerID="2cf5426404531e12a46ee3a8e44d387c74a074af0054d3b84a14a5f6be7c6c34" Dec 03 17:29:21 crc kubenswrapper[4841]: I1203 17:29:21.370078 4841 scope.go:117] "RemoveContainer" containerID="cf21ee40cdc8d44dfe634a4873c13fa1b22953016a770dd55ac1a379e6c3804d" Dec 03 17:29:26 crc kubenswrapper[4841]: I1203 17:29:26.251759 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:29:26 crc kubenswrapper[4841]: E1203 17:29:26.253015 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:29:29 crc kubenswrapper[4841]: I1203 17:29:29.061295 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-8lggq"] Dec 03 17:29:29 crc kubenswrapper[4841]: I1203 17:29:29.073664 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-gdwlx"] Dec 03 17:29:29 crc kubenswrapper[4841]: I1203 17:29:29.087248 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-db-sync-gdwlx"] Dec 03 17:29:29 crc kubenswrapper[4841]: I1203 17:29:29.098262 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-8lggq"] Dec 03 17:29:30 crc kubenswrapper[4841]: I1203 17:29:30.255118 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2fec744-9e89-4330-88a5-f0e4c2173870" path="/var/lib/kubelet/pods/b2fec744-9e89-4330-88a5-f0e4c2173870/volumes" Dec 03 17:29:30 crc kubenswrapper[4841]: I1203 17:29:30.256588 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c366f0b0-a1f7-4452-93c9-d408d117d651" path="/var/lib/kubelet/pods/c366f0b0-a1f7-4452-93c9-d408d117d651/volumes" Dec 03 17:29:38 crc kubenswrapper[4841]: I1203 17:29:38.240228 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:29:38 crc kubenswrapper[4841]: E1203 17:29:38.241535 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:29:39 crc kubenswrapper[4841]: I1203 17:29:39.045450 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-gj6sq"] Dec 03 17:29:39 crc kubenswrapper[4841]: I1203 17:29:39.060244 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-gj6sq"] Dec 03 17:29:40 crc kubenswrapper[4841]: I1203 17:29:40.048039 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-drrfd"] Dec 03 17:29:40 crc kubenswrapper[4841]: I1203 17:29:40.055794 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-drrfd"] 
Dec 03 17:29:40 crc kubenswrapper[4841]: I1203 17:29:40.256107 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa" path="/var/lib/kubelet/pods/084bb3a0-2558-43d3-a0f5-7ac7cd02c3fa/volumes"
Dec 03 17:29:40 crc kubenswrapper[4841]: I1203 17:29:40.257722 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a7b606d-af6a-477c-9ac6-f93db645651d" path="/var/lib/kubelet/pods/6a7b606d-af6a-477c-9ac6-f93db645651d/volumes"
Dec 03 17:29:42 crc kubenswrapper[4841]: I1203 17:29:42.034115 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-82zhd"]
Dec 03 17:29:42 crc kubenswrapper[4841]: I1203 17:29:42.045927 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-82zhd"]
Dec 03 17:29:42 crc kubenswrapper[4841]: I1203 17:29:42.256516 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c094953c-fc36-4dda-9497-381f9ae48471" path="/var/lib/kubelet/pods/c094953c-fc36-4dda-9497-381f9ae48471/volumes"
Dec 03 17:29:53 crc kubenswrapper[4841]: I1203 17:29:53.239128 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686"
Dec 03 17:29:53 crc kubenswrapper[4841]: E1203 17:29:53.241519 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.149121 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"]
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.150858 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.153475 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.154833 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.158349 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"]
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.278799 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-config-volume\") pod \"collect-profiles-29413050-86wwr\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.279054 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkwpg\" (UniqueName: \"kubernetes.io/projected/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-kube-api-access-mkwpg\") pod \"collect-profiles-29413050-86wwr\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.279239 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-secret-volume\") pod \"collect-profiles-29413050-86wwr\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.380775 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-config-volume\") pod \"collect-profiles-29413050-86wwr\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.380986 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkwpg\" (UniqueName: \"kubernetes.io/projected/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-kube-api-access-mkwpg\") pod \"collect-profiles-29413050-86wwr\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.381155 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-secret-volume\") pod \"collect-profiles-29413050-86wwr\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.382237 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-config-volume\") pod \"collect-profiles-29413050-86wwr\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.400236 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-secret-volume\") pod \"collect-profiles-29413050-86wwr\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.405115 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkwpg\" (UniqueName: \"kubernetes.io/projected/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-kube-api-access-mkwpg\") pod \"collect-profiles-29413050-86wwr\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.476235 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:00 crc kubenswrapper[4841]: I1203 17:30:00.922890 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"]
Dec 03 17:30:01 crc kubenswrapper[4841]: I1203 17:30:01.108041 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr" event={"ID":"3699b8e8-84f3-4772-ad5b-b8b02a370fcc","Type":"ContainerStarted","Data":"16d275afc93616ca79e27f448fd0bcdc91b152aa62889d3db976c8073ff2691c"}
Dec 03 17:30:02 crc kubenswrapper[4841]: I1203 17:30:02.120007 4841 generic.go:334] "Generic (PLEG): container finished" podID="3699b8e8-84f3-4772-ad5b-b8b02a370fcc" containerID="f282fe2a8e87d1ee6870a6da682bb8dafbef493020f5e90e24987a965596689a" exitCode=0
Dec 03 17:30:02 crc kubenswrapper[4841]: I1203 17:30:02.120105 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr" event={"ID":"3699b8e8-84f3-4772-ad5b-b8b02a370fcc","Type":"ContainerDied","Data":"f282fe2a8e87d1ee6870a6da682bb8dafbef493020f5e90e24987a965596689a"}
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.585263 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.643825 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-config-volume\") pod \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") "
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.643999 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkwpg\" (UniqueName: \"kubernetes.io/projected/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-kube-api-access-mkwpg\") pod \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") "
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.644078 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-secret-volume\") pod \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\" (UID: \"3699b8e8-84f3-4772-ad5b-b8b02a370fcc\") "
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.644426 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-config-volume" (OuterVolumeSpecName: "config-volume") pod "3699b8e8-84f3-4772-ad5b-b8b02a370fcc" (UID: "3699b8e8-84f3-4772-ad5b-b8b02a370fcc"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.644584 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.650166 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-kube-api-access-mkwpg" (OuterVolumeSpecName: "kube-api-access-mkwpg") pod "3699b8e8-84f3-4772-ad5b-b8b02a370fcc" (UID: "3699b8e8-84f3-4772-ad5b-b8b02a370fcc"). InnerVolumeSpecName "kube-api-access-mkwpg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.654079 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3699b8e8-84f3-4772-ad5b-b8b02a370fcc" (UID: "3699b8e8-84f3-4772-ad5b-b8b02a370fcc"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.746598 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkwpg\" (UniqueName: \"kubernetes.io/projected/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-kube-api-access-mkwpg\") on node \"crc\" DevicePath \"\""
Dec 03 17:30:03 crc kubenswrapper[4841]: I1203 17:30:03.746640 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3699b8e8-84f3-4772-ad5b-b8b02a370fcc-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 17:30:04 crc kubenswrapper[4841]: I1203 17:30:04.149068 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr" event={"ID":"3699b8e8-84f3-4772-ad5b-b8b02a370fcc","Type":"ContainerDied","Data":"16d275afc93616ca79e27f448fd0bcdc91b152aa62889d3db976c8073ff2691c"}
Dec 03 17:30:04 crc kubenswrapper[4841]: I1203 17:30:04.150194 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16d275afc93616ca79e27f448fd0bcdc91b152aa62889d3db976c8073ff2691c"
Dec 03 17:30:04 crc kubenswrapper[4841]: I1203 17:30:04.149212 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"
Dec 03 17:30:05 crc kubenswrapper[4841]: I1203 17:30:05.238686 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686"
Dec 03 17:30:05 crc kubenswrapper[4841]: E1203 17:30:05.239071 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:30:17 crc kubenswrapper[4841]: I1203 17:30:17.239598 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686"
Dec 03 17:30:17 crc kubenswrapper[4841]: E1203 17:30:17.240717 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:30:21 crc kubenswrapper[4841]: I1203 17:30:21.695567 4841 scope.go:117] "RemoveContainer" containerID="f06c4fa375358c54eef0791779277a440dec87b0219d207917bc57f5f83f2919"
Dec 03 17:30:21 crc kubenswrapper[4841]: I1203 17:30:21.758723 4841 scope.go:117] "RemoveContainer" containerID="4c9447f0ac6d6d48eaf6dae4a6ff308647d900b06d0b0415878fd555669c1cbd"
Dec 03 17:30:21 crc kubenswrapper[4841]: I1203 17:30:21.836328 4841 scope.go:117] "RemoveContainer" containerID="49dbc947db12cfbe84efcd55391a15aa3b2a13e182496b08f555ed86684a8885"
Dec 03 17:30:21 crc kubenswrapper[4841]: I1203 17:30:21.884268 4841 scope.go:117] "RemoveContainer" containerID="7ffa8405ccdd37a45e196fc903cd0c7bd7ab8e307ffe8880f5de6bb5ede6c861"
Dec 03 17:30:21 crc kubenswrapper[4841]: I1203 17:30:21.935134 4841 scope.go:117] "RemoveContainer" containerID="f7b88de14b208885be22036f254026a8e37099177dcb084d9369893a26997d5e"
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.080542 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-a208-account-create-update-fzsjz"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.094261 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-njcv9"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.105148 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-cf3a-account-create-update-9qvw6"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.113451 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-f45bf"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.119890 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-a027-account-create-update-6gnsg"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.125875 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-cxq5q"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.131958 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-a208-account-create-update-fzsjz"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.137706 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-cf3a-account-create-update-9qvw6"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.143805 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-njcv9"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.149614 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-f45bf"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.155356 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-cxq5q"]
Dec 03 17:30:27 crc kubenswrapper[4841]: I1203 17:30:27.160948 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-a027-account-create-update-6gnsg"]
Dec 03 17:30:28 crc kubenswrapper[4841]: I1203 17:30:28.250557 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b588dfb-c0c3-44c9-bb78-88f38c32d7d8" path="/var/lib/kubelet/pods/3b588dfb-c0c3-44c9-bb78-88f38c32d7d8/volumes"
Dec 03 17:30:28 crc kubenswrapper[4841]: I1203 17:30:28.251548 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b5e1317-cb39-41ff-bd74-0ff6f5888838" path="/var/lib/kubelet/pods/4b5e1317-cb39-41ff-bd74-0ff6f5888838/volumes"
Dec 03 17:30:28 crc kubenswrapper[4841]: I1203 17:30:28.252090 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8770bd7d-05a2-4cc4-9484-aad5e79ad58d" path="/var/lib/kubelet/pods/8770bd7d-05a2-4cc4-9484-aad5e79ad58d/volumes"
Dec 03 17:30:28 crc kubenswrapper[4841]: I1203 17:30:28.252624 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fa56c6b-2244-457e-a7ea-cf0f61bc8b10" path="/var/lib/kubelet/pods/8fa56c6b-2244-457e-a7ea-cf0f61bc8b10/volumes"
Dec 03 17:30:28 crc kubenswrapper[4841]: I1203 17:30:28.253617 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bad0d4a1-7f8d-4417-bbb2-df84240ddb39" path="/var/lib/kubelet/pods/bad0d4a1-7f8d-4417-bbb2-df84240ddb39/volumes"
Dec 03 17:30:28 crc kubenswrapper[4841]: I1203 17:30:28.254179 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="defee520-123a-4d81-89c9-3ac1100f37ea" path="/var/lib/kubelet/pods/defee520-123a-4d81-89c9-3ac1100f37ea/volumes"
Dec 03 17:30:31 crc kubenswrapper[4841]: I1203 17:30:31.238728 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686"
Dec 03 17:30:31 crc kubenswrapper[4841]: E1203 17:30:31.239209 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:30:31 crc kubenswrapper[4841]: I1203 17:30:31.450291 4841 generic.go:334] "Generic (PLEG): container finished" podID="b09b36ac-de85-4fa1-ab95-1c24c0c33c0d" containerID="08800eef312ec6eff6f6a9f0cbec1f01eaa3cb2041437b5472ef7c98553feda9" exitCode=0
Dec 03 17:30:31 crc kubenswrapper[4841]: I1203 17:30:31.450340 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" event={"ID":"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d","Type":"ContainerDied","Data":"08800eef312ec6eff6f6a9f0cbec1f01eaa3cb2041437b5472ef7c98553feda9"}
Dec 03 17:30:32 crc kubenswrapper[4841]: I1203 17:30:32.975297 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.084975 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-inventory\") pod \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") "
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.085048 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpb6z\" (UniqueName: \"kubernetes.io/projected/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-kube-api-access-lpb6z\") pod \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") "
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.085264 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-ssh-key\") pod \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\" (UID: \"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d\") "
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.093135 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-kube-api-access-lpb6z" (OuterVolumeSpecName: "kube-api-access-lpb6z") pod "b09b36ac-de85-4fa1-ab95-1c24c0c33c0d" (UID: "b09b36ac-de85-4fa1-ab95-1c24c0c33c0d"). InnerVolumeSpecName "kube-api-access-lpb6z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.114369 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b09b36ac-de85-4fa1-ab95-1c24c0c33c0d" (UID: "b09b36ac-de85-4fa1-ab95-1c24c0c33c0d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.116423 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-inventory" (OuterVolumeSpecName: "inventory") pod "b09b36ac-de85-4fa1-ab95-1c24c0c33c0d" (UID: "b09b36ac-de85-4fa1-ab95-1c24c0c33c0d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.187015 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.187550 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.187644 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpb6z\" (UniqueName: \"kubernetes.io/projected/b09b36ac-de85-4fa1-ab95-1c24c0c33c0d-kube-api-access-lpb6z\") on node \"crc\" DevicePath \"\""
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.480773 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b" event={"ID":"b09b36ac-de85-4fa1-ab95-1c24c0c33c0d","Type":"ContainerDied","Data":"6689740a4e74177ebbcb136ca64b49b7fb7b8958e18a4ed23bd28aa472873366"}
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.480817 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6689740a4e74177ebbcb136ca64b49b7fb7b8958e18a4ed23bd28aa472873366"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.480886 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.583318 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"]
Dec 03 17:30:33 crc kubenswrapper[4841]: E1203 17:30:33.583762 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3699b8e8-84f3-4772-ad5b-b8b02a370fcc" containerName="collect-profiles"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.583783 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3699b8e8-84f3-4772-ad5b-b8b02a370fcc" containerName="collect-profiles"
Dec 03 17:30:33 crc kubenswrapper[4841]: E1203 17:30:33.583809 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b09b36ac-de85-4fa1-ab95-1c24c0c33c0d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.583819 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b09b36ac-de85-4fa1-ab95-1c24c0c33c0d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.584093 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3699b8e8-84f3-4772-ad5b-b8b02a370fcc" containerName="collect-profiles"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.584132 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b09b36ac-de85-4fa1-ab95-1c24c0c33c0d" containerName="configure-network-edpm-deployment-openstack-edpm-ipam"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.584972 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.590120 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.590204 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.590895 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.592867 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.597891 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"]
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.699664 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.699716 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s252h\" (UniqueName: \"kubernetes.io/projected/8880e946-3512-4dfc-9d56-c3210fd50e21-kube-api-access-s252h\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.699845 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.802135 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.802220 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s252h\" (UniqueName: \"kubernetes.io/projected/8880e946-3512-4dfc-9d56-c3210fd50e21-kube-api-access-s252h\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.802431 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.809340 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.809381 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.846298 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s252h\" (UniqueName: \"kubernetes.io/projected/8880e946-3512-4dfc-9d56-c3210fd50e21-kube-api-access-s252h\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:33 crc kubenswrapper[4841]: I1203 17:30:33.901810 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:34 crc kubenswrapper[4841]: I1203 17:30:34.577521 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"]
Dec 03 17:30:35 crc kubenswrapper[4841]: I1203 17:30:35.502300 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf" event={"ID":"8880e946-3512-4dfc-9d56-c3210fd50e21","Type":"ContainerStarted","Data":"025345e1d32f33cb35324d7d68e0a25c2877238e364fc6aac67354d478b5cc95"}
Dec 03 17:30:35 crc kubenswrapper[4841]: I1203 17:30:35.502743 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf" event={"ID":"8880e946-3512-4dfc-9d56-c3210fd50e21","Type":"ContainerStarted","Data":"dfd620a505e09b7d0f47a22b8e76027e58aa73904026f74d6cf151d1bcc12532"}
Dec 03 17:30:35 crc kubenswrapper[4841]: I1203 17:30:35.525819 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf" podStartSLOduration=2.091410622 podStartE2EDuration="2.525801172s" podCreationTimestamp="2025-12-03 17:30:33 +0000 UTC" firstStartedPulling="2025-12-03 17:30:34.582181785 +0000 UTC m=+1828.969702512" lastFinishedPulling="2025-12-03 17:30:35.016572335 +0000 UTC m=+1829.404093062" observedRunningTime="2025-12-03 17:30:35.520215906 +0000 UTC m=+1829.907736673" watchObservedRunningTime="2025-12-03 17:30:35.525801172 +0000 UTC m=+1829.913321899"
Dec 03 17:30:41 crc kubenswrapper[4841]: I1203 17:30:41.563658 4841 generic.go:334] "Generic (PLEG): container finished" podID="8880e946-3512-4dfc-9d56-c3210fd50e21" containerID="025345e1d32f33cb35324d7d68e0a25c2877238e364fc6aac67354d478b5cc95" exitCode=0
Dec 03 17:30:41 crc kubenswrapper[4841]: I1203 17:30:41.563774 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf" event={"ID":"8880e946-3512-4dfc-9d56-c3210fd50e21","Type":"ContainerDied","Data":"025345e1d32f33cb35324d7d68e0a25c2877238e364fc6aac67354d478b5cc95"}
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.080129 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.111340 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-ssh-key\") pod \"8880e946-3512-4dfc-9d56-c3210fd50e21\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") "
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.111557 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-inventory\") pod \"8880e946-3512-4dfc-9d56-c3210fd50e21\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") "
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.111652 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s252h\" (UniqueName: \"kubernetes.io/projected/8880e946-3512-4dfc-9d56-c3210fd50e21-kube-api-access-s252h\") pod \"8880e946-3512-4dfc-9d56-c3210fd50e21\" (UID: \"8880e946-3512-4dfc-9d56-c3210fd50e21\") "
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.133239 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8880e946-3512-4dfc-9d56-c3210fd50e21-kube-api-access-s252h" (OuterVolumeSpecName: "kube-api-access-s252h") pod "8880e946-3512-4dfc-9d56-c3210fd50e21" (UID: "8880e946-3512-4dfc-9d56-c3210fd50e21"). InnerVolumeSpecName "kube-api-access-s252h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.170027 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8880e946-3512-4dfc-9d56-c3210fd50e21" (UID: "8880e946-3512-4dfc-9d56-c3210fd50e21"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.175607 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-inventory" (OuterVolumeSpecName: "inventory") pod "8880e946-3512-4dfc-9d56-c3210fd50e21" (UID: "8880e946-3512-4dfc-9d56-c3210fd50e21"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.218614 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-ssh-key\") on node \"crc\" DevicePath \"\""
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.218995 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8880e946-3512-4dfc-9d56-c3210fd50e21-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.219016 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s252h\" (UniqueName: \"kubernetes.io/projected/8880e946-3512-4dfc-9d56-c3210fd50e21-kube-api-access-s252h\") on node \"crc\" DevicePath \"\""
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.589366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf" event={"ID":"8880e946-3512-4dfc-9d56-c3210fd50e21","Type":"ContainerDied","Data":"dfd620a505e09b7d0f47a22b8e76027e58aa73904026f74d6cf151d1bcc12532"}
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.589491 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd620a505e09b7d0f47a22b8e76027e58aa73904026f74d6cf151d1bcc12532"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.589480 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.776950 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"]
Dec 03 17:30:43 crc kubenswrapper[4841]: E1203 17:30:43.777423 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8880e946-3512-4dfc-9d56-c3210fd50e21" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.777445 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8880e946-3512-4dfc-9d56-c3210fd50e21" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.777738 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8880e946-3512-4dfc-9d56-c3210fd50e21" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.778508 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.783264 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.783326 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.783583 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.791014 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.795736 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"]
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.831403 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r5qds\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.831468 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r5qds\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.831510 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sbqk\" (UniqueName: \"kubernetes.io/projected/ddd71965-3c25-46fe-a129-4e674bf7dcca-kube-api-access-8sbqk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r5qds\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.933730 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r5qds\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.933845 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sbqk\" (UniqueName: \"kubernetes.io/projected/ddd71965-3c25-46fe-a129-4e674bf7dcca-kube-api-access-8sbqk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r5qds\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.934168 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r5qds\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"
Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.939695 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r5qds\" (UID: 
\"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.940793 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r5qds\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" Dec 03 17:30:43 crc kubenswrapper[4841]: I1203 17:30:43.955390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sbqk\" (UniqueName: \"kubernetes.io/projected/ddd71965-3c25-46fe-a129-4e674bf7dcca-kube-api-access-8sbqk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-r5qds\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" Dec 03 17:30:44 crc kubenswrapper[4841]: I1203 17:30:44.104045 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" Dec 03 17:30:44 crc kubenswrapper[4841]: I1203 17:30:44.240246 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:30:44 crc kubenswrapper[4841]: E1203 17:30:44.241409 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:30:44 crc kubenswrapper[4841]: I1203 17:30:44.822880 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds"] Dec 03 17:30:45 crc kubenswrapper[4841]: I1203 17:30:45.648956 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" event={"ID":"ddd71965-3c25-46fe-a129-4e674bf7dcca","Type":"ContainerStarted","Data":"4620e7c096b0dc6e6558fd0c50a78e63e6e22254a699fdbd231b3b94e211aecb"} Dec 03 17:30:45 crc kubenswrapper[4841]: I1203 17:30:45.650158 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" event={"ID":"ddd71965-3c25-46fe-a129-4e674bf7dcca","Type":"ContainerStarted","Data":"dce4996cc3ef17b87ce0ba9bed1d63dc4c193d2219c8d534c943d512b6f17274"} Dec 03 17:30:45 crc kubenswrapper[4841]: I1203 17:30:45.683223 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" podStartSLOduration=2.2711278630000002 podStartE2EDuration="2.683201969s" podCreationTimestamp="2025-12-03 17:30:43 +0000 UTC" firstStartedPulling="2025-12-03 
17:30:44.840022253 +0000 UTC m=+1839.227542990" lastFinishedPulling="2025-12-03 17:30:45.252096329 +0000 UTC m=+1839.639617096" observedRunningTime="2025-12-03 17:30:45.673077592 +0000 UTC m=+1840.060598359" watchObservedRunningTime="2025-12-03 17:30:45.683201969 +0000 UTC m=+1840.070722706" Dec 03 17:30:56 crc kubenswrapper[4841]: I1203 17:30:56.246865 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:30:56 crc kubenswrapper[4841]: E1203 17:30:56.248102 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:30:57 crc kubenswrapper[4841]: I1203 17:30:57.085765 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gddr9"] Dec 03 17:30:57 crc kubenswrapper[4841]: I1203 17:30:57.097620 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gddr9"] Dec 03 17:30:58 crc kubenswrapper[4841]: I1203 17:30:58.253295 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c71fa5a-194d-4d9f-a591-c4db053a2b01" path="/var/lib/kubelet/pods/1c71fa5a-194d-4d9f-a591-c4db053a2b01/volumes" Dec 03 17:31:09 crc kubenswrapper[4841]: I1203 17:31:09.239609 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:31:09 crc kubenswrapper[4841]: E1203 17:31:09.240570 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:31:14 crc kubenswrapper[4841]: I1203 17:31:14.057843 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-frcwp"] Dec 03 17:31:14 crc kubenswrapper[4841]: I1203 17:31:14.072486 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-frcwp"] Dec 03 17:31:14 crc kubenswrapper[4841]: I1203 17:31:14.256574 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21f13e1c-c61f-4ccc-881a-036eede4e140" path="/var/lib/kubelet/pods/21f13e1c-c61f-4ccc-881a-036eede4e140/volumes" Dec 03 17:31:16 crc kubenswrapper[4841]: I1203 17:31:16.035377 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d5k7k"] Dec 03 17:31:16 crc kubenswrapper[4841]: I1203 17:31:16.046265 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-d5k7k"] Dec 03 17:31:16 crc kubenswrapper[4841]: I1203 17:31:16.257318 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15" path="/var/lib/kubelet/pods/06eabf2f-e90f-4451-9f9e-ad9bc1a3dc15/volumes" Dec 03 17:31:22 crc kubenswrapper[4841]: I1203 17:31:22.133602 4841 scope.go:117] "RemoveContainer" containerID="e9a29522c92532e17e041a2e394eace09dfce8e8d095eac48349107022dd2729" Dec 03 17:31:22 crc kubenswrapper[4841]: I1203 17:31:22.173951 4841 scope.go:117] "RemoveContainer" containerID="76838e7afce317c8d2fb6cee10a9505ee704fa15b02fdfb938f26df657901729" Dec 03 17:31:22 crc kubenswrapper[4841]: I1203 17:31:22.235827 4841 scope.go:117] "RemoveContainer" containerID="0dfa8076e353712ef5974cfe0b2ae72a1df067ccc7df6f10311934ebd56b28b4" Dec 03 17:31:22 crc 
kubenswrapper[4841]: I1203 17:31:22.318061 4841 scope.go:117] "RemoveContainer" containerID="cbe654b0b644c77d1c3a980307cd97270ee6d59e76f24452b50391f0349cb1c3" Dec 03 17:31:22 crc kubenswrapper[4841]: I1203 17:31:22.354630 4841 scope.go:117] "RemoveContainer" containerID="e81b656705ef60f35ed3d450b402e5808b1cda91c2c3177cd557019368cd9b69" Dec 03 17:31:22 crc kubenswrapper[4841]: I1203 17:31:22.389804 4841 scope.go:117] "RemoveContainer" containerID="4637e098c6ee6399f08f3dfea946d0a93fa18ee9c880a15a020c8e732b91f40e" Dec 03 17:31:22 crc kubenswrapper[4841]: I1203 17:31:22.435943 4841 scope.go:117] "RemoveContainer" containerID="89e0789f1b8c8934114b13671a97a4776581e7656c70eba2d0e7d28c9ea4648c" Dec 03 17:31:22 crc kubenswrapper[4841]: I1203 17:31:22.487113 4841 scope.go:117] "RemoveContainer" containerID="d543f4ded22e1009abf6e163398e56c9a56bcbd5645826441422339c784a568d" Dec 03 17:31:22 crc kubenswrapper[4841]: I1203 17:31:22.526782 4841 scope.go:117] "RemoveContainer" containerID="435821fbe8b6d9cbd84e2fc94e04ec1826c70d793cac143da60cf134fb0cc891" Dec 03 17:31:23 crc kubenswrapper[4841]: I1203 17:31:23.238722 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:31:24 crc kubenswrapper[4841]: I1203 17:31:24.131004 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"bed7cf0f01ebf759561d9f75934aee183d75629f6f360f64928e5bdd416653d8"} Dec 03 17:31:30 crc kubenswrapper[4841]: I1203 17:31:30.206200 4841 generic.go:334] "Generic (PLEG): container finished" podID="ddd71965-3c25-46fe-a129-4e674bf7dcca" containerID="4620e7c096b0dc6e6558fd0c50a78e63e6e22254a699fdbd231b3b94e211aecb" exitCode=0 Dec 03 17:31:30 crc kubenswrapper[4841]: I1203 17:31:30.206324 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" event={"ID":"ddd71965-3c25-46fe-a129-4e674bf7dcca","Type":"ContainerDied","Data":"4620e7c096b0dc6e6558fd0c50a78e63e6e22254a699fdbd231b3b94e211aecb"} Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.698972 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.779870 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-inventory\") pod \"ddd71965-3c25-46fe-a129-4e674bf7dcca\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.779945 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-ssh-key\") pod \"ddd71965-3c25-46fe-a129-4e674bf7dcca\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.780052 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sbqk\" (UniqueName: \"kubernetes.io/projected/ddd71965-3c25-46fe-a129-4e674bf7dcca-kube-api-access-8sbqk\") pod \"ddd71965-3c25-46fe-a129-4e674bf7dcca\" (UID: \"ddd71965-3c25-46fe-a129-4e674bf7dcca\") " Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.785072 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd71965-3c25-46fe-a129-4e674bf7dcca-kube-api-access-8sbqk" (OuterVolumeSpecName: "kube-api-access-8sbqk") pod "ddd71965-3c25-46fe-a129-4e674bf7dcca" (UID: "ddd71965-3c25-46fe-a129-4e674bf7dcca"). InnerVolumeSpecName "kube-api-access-8sbqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.805978 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "ddd71965-3c25-46fe-a129-4e674bf7dcca" (UID: "ddd71965-3c25-46fe-a129-4e674bf7dcca"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.809220 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-inventory" (OuterVolumeSpecName: "inventory") pod "ddd71965-3c25-46fe-a129-4e674bf7dcca" (UID: "ddd71965-3c25-46fe-a129-4e674bf7dcca"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.881719 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.881747 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/ddd71965-3c25-46fe-a129-4e674bf7dcca-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:31:31 crc kubenswrapper[4841]: I1203 17:31:31.881761 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sbqk\" (UniqueName: \"kubernetes.io/projected/ddd71965-3c25-46fe-a129-4e674bf7dcca-kube-api-access-8sbqk\") on node \"crc\" DevicePath \"\"" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.227396 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" 
event={"ID":"ddd71965-3c25-46fe-a129-4e674bf7dcca","Type":"ContainerDied","Data":"dce4996cc3ef17b87ce0ba9bed1d63dc4c193d2219c8d534c943d512b6f17274"} Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.227440 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dce4996cc3ef17b87ce0ba9bed1d63dc4c193d2219c8d534c943d512b6f17274" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.227417 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-r5qds" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.353881 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt"] Dec 03 17:31:32 crc kubenswrapper[4841]: E1203 17:31:32.354545 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd71965-3c25-46fe-a129-4e674bf7dcca" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.354564 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd71965-3c25-46fe-a129-4e674bf7dcca" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.354805 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd71965-3c25-46fe-a129-4e674bf7dcca" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.355651 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.358823 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.359150 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.359282 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.359406 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.368805 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt"] Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.493537 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.493636 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.493713 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swszt\" (UniqueName: \"kubernetes.io/projected/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-kube-api-access-swszt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.595773 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.595874 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swszt\" (UniqueName: \"kubernetes.io/projected/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-kube-api-access-swszt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.596001 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.601027 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt\" (UID: 
\"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.601536 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.619498 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swszt\" (UniqueName: \"kubernetes.io/projected/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-kube-api-access-swszt\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:32 crc kubenswrapper[4841]: I1203 17:31:32.703895 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:31:33 crc kubenswrapper[4841]: I1203 17:31:33.233557 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt"] Dec 03 17:31:34 crc kubenswrapper[4841]: I1203 17:31:34.258189 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" event={"ID":"b0be8d7d-6270-4255-8c6d-6a50f8c741a2","Type":"ContainerStarted","Data":"489cf6cc4e0504bbbd236f02e2a8ed0b58dc4d07cb36611338ca15cb987a250d"} Dec 03 17:31:35 crc kubenswrapper[4841]: I1203 17:31:35.267361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" event={"ID":"b0be8d7d-6270-4255-8c6d-6a50f8c741a2","Type":"ContainerStarted","Data":"2a9bba0e5b97c836f25a3239005d419bd25cbef72bdbcdffcda2ff17c6b16f54"} Dec 03 17:31:35 crc kubenswrapper[4841]: I1203 17:31:35.295646 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" podStartSLOduration=1.639098108 podStartE2EDuration="3.295622892s" podCreationTimestamp="2025-12-03 17:31:32 +0000 UTC" firstStartedPulling="2025-12-03 17:31:33.238987134 +0000 UTC m=+1887.626507861" lastFinishedPulling="2025-12-03 17:31:34.895511878 +0000 UTC m=+1889.283032645" observedRunningTime="2025-12-03 17:31:35.287451782 +0000 UTC m=+1889.674972529" watchObservedRunningTime="2025-12-03 17:31:35.295622892 +0000 UTC m=+1889.683143619" Dec 03 17:31:59 crc kubenswrapper[4841]: I1203 17:31:59.046435 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-v74jz"] Dec 03 17:31:59 crc kubenswrapper[4841]: I1203 17:31:59.053684 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-v74jz"] Dec 03 17:32:00 crc kubenswrapper[4841]: I1203 
17:32:00.253948 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126147ee-3dab-46a0-81c9-5e1e2793cd26" path="/var/lib/kubelet/pods/126147ee-3dab-46a0-81c9-5e1e2793cd26/volumes" Dec 03 17:32:22 crc kubenswrapper[4841]: I1203 17:32:22.757092 4841 scope.go:117] "RemoveContainer" containerID="3551c145eab7c0b36bc436335bc010966277c44494769c98ee30cc072d035b37" Dec 03 17:32:32 crc kubenswrapper[4841]: I1203 17:32:32.911216 4841 generic.go:334] "Generic (PLEG): container finished" podID="b0be8d7d-6270-4255-8c6d-6a50f8c741a2" containerID="2a9bba0e5b97c836f25a3239005d419bd25cbef72bdbcdffcda2ff17c6b16f54" exitCode=0 Dec 03 17:32:32 crc kubenswrapper[4841]: I1203 17:32:32.911340 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" event={"ID":"b0be8d7d-6270-4255-8c6d-6a50f8c741a2","Type":"ContainerDied","Data":"2a9bba0e5b97c836f25a3239005d419bd25cbef72bdbcdffcda2ff17c6b16f54"} Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.388269 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.556438 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-inventory\") pod \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.556814 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-ssh-key\") pod \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.556880 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swszt\" (UniqueName: \"kubernetes.io/projected/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-kube-api-access-swszt\") pod \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\" (UID: \"b0be8d7d-6270-4255-8c6d-6a50f8c741a2\") " Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.578867 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-kube-api-access-swszt" (OuterVolumeSpecName: "kube-api-access-swszt") pod "b0be8d7d-6270-4255-8c6d-6a50f8c741a2" (UID: "b0be8d7d-6270-4255-8c6d-6a50f8c741a2"). InnerVolumeSpecName "kube-api-access-swszt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.606748 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b0be8d7d-6270-4255-8c6d-6a50f8c741a2" (UID: "b0be8d7d-6270-4255-8c6d-6a50f8c741a2"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.607987 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-inventory" (OuterVolumeSpecName: "inventory") pod "b0be8d7d-6270-4255-8c6d-6a50f8c741a2" (UID: "b0be8d7d-6270-4255-8c6d-6a50f8c741a2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.660356 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.660403 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.660422 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swszt\" (UniqueName: \"kubernetes.io/projected/b0be8d7d-6270-4255-8c6d-6a50f8c741a2-kube-api-access-swszt\") on node \"crc\" DevicePath \"\"" Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.938154 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" event={"ID":"b0be8d7d-6270-4255-8c6d-6a50f8c741a2","Type":"ContainerDied","Data":"489cf6cc4e0504bbbd236f02e2a8ed0b58dc4d07cb36611338ca15cb987a250d"} Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.938220 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt" Dec 03 17:32:34 crc kubenswrapper[4841]: I1203 17:32:34.938222 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="489cf6cc4e0504bbbd236f02e2a8ed0b58dc4d07cb36611338ca15cb987a250d" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.045079 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8rlr"] Dec 03 17:32:35 crc kubenswrapper[4841]: E1203 17:32:35.045488 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0be8d7d-6270-4255-8c6d-6a50f8c741a2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.045509 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0be8d7d-6270-4255-8c6d-6a50f8c741a2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.045773 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0be8d7d-6270-4255-8c6d-6a50f8c741a2" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.046729 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.050308 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.050567 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.050783 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.050868 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.058764 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8rlr"] Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.209799 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8rlr\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.209868 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2588\" (UniqueName: \"kubernetes.io/projected/c05eea2a-71d0-483b-a0b9-92b28743b13e-kube-api-access-f2588\") pod \"ssh-known-hosts-edpm-deployment-x8rlr\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.210748 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8rlr\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.313425 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8rlr\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.313543 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8rlr\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.313609 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2588\" (UniqueName: \"kubernetes.io/projected/c05eea2a-71d0-483b-a0b9-92b28743b13e-kube-api-access-f2588\") pod \"ssh-known-hosts-edpm-deployment-x8rlr\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.318738 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-x8rlr\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc 
kubenswrapper[4841]: I1203 17:32:35.320136 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-x8rlr\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.343054 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2588\" (UniqueName: \"kubernetes.io/projected/c05eea2a-71d0-483b-a0b9-92b28743b13e-kube-api-access-f2588\") pod \"ssh-known-hosts-edpm-deployment-x8rlr\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:35 crc kubenswrapper[4841]: I1203 17:32:35.435274 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:36 crc kubenswrapper[4841]: I1203 17:32:36.081049 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-x8rlr"] Dec 03 17:32:36 crc kubenswrapper[4841]: W1203 17:32:36.082520 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc05eea2a_71d0_483b_a0b9_92b28743b13e.slice/crio-7c36fbe0afb2797548ae60aa4e4fdea204d88135307b63950c17cf7015673a86 WatchSource:0}: Error finding container 7c36fbe0afb2797548ae60aa4e4fdea204d88135307b63950c17cf7015673a86: Status 404 returned error can't find the container with id 7c36fbe0afb2797548ae60aa4e4fdea204d88135307b63950c17cf7015673a86 Dec 03 17:32:36 crc kubenswrapper[4841]: I1203 17:32:36.085606 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:32:36 crc kubenswrapper[4841]: I1203 17:32:36.970199 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" event={"ID":"c05eea2a-71d0-483b-a0b9-92b28743b13e","Type":"ContainerStarted","Data":"db675c25e6f0b86413f662ed419f4b62fd71e3cbb89e6a62e1706044075493cc"} Dec 03 17:32:36 crc kubenswrapper[4841]: I1203 17:32:36.970734 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" event={"ID":"c05eea2a-71d0-483b-a0b9-92b28743b13e","Type":"ContainerStarted","Data":"7c36fbe0afb2797548ae60aa4e4fdea204d88135307b63950c17cf7015673a86"} Dec 03 17:32:36 crc kubenswrapper[4841]: I1203 17:32:36.991867 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" podStartSLOduration=1.5223298650000001 podStartE2EDuration="1.991843312s" podCreationTimestamp="2025-12-03 17:32:35 +0000 UTC" firstStartedPulling="2025-12-03 17:32:36.085241529 +0000 UTC m=+1950.472762266" lastFinishedPulling="2025-12-03 17:32:36.554754986 +0000 UTC m=+1950.942275713" observedRunningTime="2025-12-03 17:32:36.989331881 +0000 UTC m=+1951.376852628" watchObservedRunningTime="2025-12-03 17:32:36.991843312 +0000 UTC m=+1951.379364059" Dec 03 17:32:45 crc kubenswrapper[4841]: I1203 17:32:45.068703 4841 generic.go:334] "Generic (PLEG): container finished" podID="c05eea2a-71d0-483b-a0b9-92b28743b13e" containerID="db675c25e6f0b86413f662ed419f4b62fd71e3cbb89e6a62e1706044075493cc" exitCode=0 Dec 03 17:32:45 crc kubenswrapper[4841]: I1203 17:32:45.068766 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" event={"ID":"c05eea2a-71d0-483b-a0b9-92b28743b13e","Type":"ContainerDied","Data":"db675c25e6f0b86413f662ed419f4b62fd71e3cbb89e6a62e1706044075493cc"} Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.534628 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.596138 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-ssh-key-openstack-edpm-ipam\") pod \"c05eea2a-71d0-483b-a0b9-92b28743b13e\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.596259 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2588\" (UniqueName: \"kubernetes.io/projected/c05eea2a-71d0-483b-a0b9-92b28743b13e-kube-api-access-f2588\") pod \"c05eea2a-71d0-483b-a0b9-92b28743b13e\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.596330 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-inventory-0\") pod \"c05eea2a-71d0-483b-a0b9-92b28743b13e\" (UID: \"c05eea2a-71d0-483b-a0b9-92b28743b13e\") " Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.602189 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c05eea2a-71d0-483b-a0b9-92b28743b13e-kube-api-access-f2588" (OuterVolumeSpecName: "kube-api-access-f2588") pod "c05eea2a-71d0-483b-a0b9-92b28743b13e" (UID: "c05eea2a-71d0-483b-a0b9-92b28743b13e"). InnerVolumeSpecName "kube-api-access-f2588". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.624603 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "c05eea2a-71d0-483b-a0b9-92b28743b13e" (UID: "c05eea2a-71d0-483b-a0b9-92b28743b13e"). 
InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.626535 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c05eea2a-71d0-483b-a0b9-92b28743b13e" (UID: "c05eea2a-71d0-483b-a0b9-92b28743b13e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.697526 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.697565 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2588\" (UniqueName: \"kubernetes.io/projected/c05eea2a-71d0-483b-a0b9-92b28743b13e-kube-api-access-f2588\") on node \"crc\" DevicePath \"\"" Dec 03 17:32:46 crc kubenswrapper[4841]: I1203 17:32:46.697579 4841 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/c05eea2a-71d0-483b-a0b9-92b28743b13e-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.093827 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" event={"ID":"c05eea2a-71d0-483b-a0b9-92b28743b13e","Type":"ContainerDied","Data":"7c36fbe0afb2797548ae60aa4e4fdea204d88135307b63950c17cf7015673a86"} Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.093867 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c36fbe0afb2797548ae60aa4e4fdea204d88135307b63950c17cf7015673a86" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.093943 
4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-x8rlr" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.195334 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz"] Dec 03 17:32:47 crc kubenswrapper[4841]: E1203 17:32:47.195791 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c05eea2a-71d0-483b-a0b9-92b28743b13e" containerName="ssh-known-hosts-edpm-deployment" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.195810 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c05eea2a-71d0-483b-a0b9-92b28743b13e" containerName="ssh-known-hosts-edpm-deployment" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.196143 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c05eea2a-71d0-483b-a0b9-92b28743b13e" containerName="ssh-known-hosts-edpm-deployment" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.196872 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.200464 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.200567 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.201256 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.201716 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.209503 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7czd8\" (UniqueName: \"kubernetes.io/projected/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-kube-api-access-7czd8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf2mz\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.210092 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf2mz\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.210434 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-inventory\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-rf2mz\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.215562 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz"] Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.312971 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf2mz\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.313080 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf2mz\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.313186 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7czd8\" (UniqueName: \"kubernetes.io/projected/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-kube-api-access-7czd8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf2mz\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.321626 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf2mz\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.322258 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf2mz\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.346123 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7czd8\" (UniqueName: \"kubernetes.io/projected/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-kube-api-access-7czd8\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rf2mz\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:47 crc kubenswrapper[4841]: I1203 17:32:47.524575 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:48 crc kubenswrapper[4841]: I1203 17:32:48.146987 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz"] Dec 03 17:32:48 crc kubenswrapper[4841]: W1203 17:32:48.167077 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f7cc5b5_8153_47ba_9f43_5e188c86d8c0.slice/crio-4417e4b7a89b79b0af1b1b79b01caef2dd9b89e278473bea03030d0053d8e758 WatchSource:0}: Error finding container 4417e4b7a89b79b0af1b1b79b01caef2dd9b89e278473bea03030d0053d8e758: Status 404 returned error can't find the container with id 4417e4b7a89b79b0af1b1b79b01caef2dd9b89e278473bea03030d0053d8e758 Dec 03 17:32:49 crc kubenswrapper[4841]: I1203 17:32:49.116159 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" event={"ID":"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0","Type":"ContainerStarted","Data":"d94509b9ec33e631f125a699348e03e1f151420052b757feb63fe83da7e35aee"} Dec 03 17:32:49 crc kubenswrapper[4841]: I1203 17:32:49.116464 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" event={"ID":"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0","Type":"ContainerStarted","Data":"4417e4b7a89b79b0af1b1b79b01caef2dd9b89e278473bea03030d0053d8e758"} Dec 03 17:32:49 crc kubenswrapper[4841]: I1203 17:32:49.142254 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" podStartSLOduration=1.732512586 podStartE2EDuration="2.142236594s" podCreationTimestamp="2025-12-03 17:32:47 +0000 UTC" firstStartedPulling="2025-12-03 17:32:48.170752727 +0000 UTC m=+1962.558273464" lastFinishedPulling="2025-12-03 17:32:48.580476705 +0000 UTC m=+1962.967997472" observedRunningTime="2025-12-03 
17:32:49.139428875 +0000 UTC m=+1963.526949632" watchObservedRunningTime="2025-12-03 17:32:49.142236594 +0000 UTC m=+1963.529757341" Dec 03 17:32:57 crc kubenswrapper[4841]: I1203 17:32:57.480530 4841 generic.go:334] "Generic (PLEG): container finished" podID="4f7cc5b5-8153-47ba-9f43-5e188c86d8c0" containerID="d94509b9ec33e631f125a699348e03e1f151420052b757feb63fe83da7e35aee" exitCode=0 Dec 03 17:32:57 crc kubenswrapper[4841]: I1203 17:32:57.480645 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" event={"ID":"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0","Type":"ContainerDied","Data":"d94509b9ec33e631f125a699348e03e1f151420052b757feb63fe83da7e35aee"} Dec 03 17:32:58 crc kubenswrapper[4841]: I1203 17:32:58.869175 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:58 crc kubenswrapper[4841]: I1203 17:32:58.918521 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-ssh-key\") pod \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " Dec 03 17:32:58 crc kubenswrapper[4841]: I1203 17:32:58.918566 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-inventory\") pod \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " Dec 03 17:32:58 crc kubenswrapper[4841]: I1203 17:32:58.918715 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7czd8\" (UniqueName: \"kubernetes.io/projected/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-kube-api-access-7czd8\") pod \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\" (UID: \"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0\") " Dec 03 17:32:58 crc 
kubenswrapper[4841]: I1203 17:32:58.924491 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-kube-api-access-7czd8" (OuterVolumeSpecName: "kube-api-access-7czd8") pod "4f7cc5b5-8153-47ba-9f43-5e188c86d8c0" (UID: "4f7cc5b5-8153-47ba-9f43-5e188c86d8c0"). InnerVolumeSpecName "kube-api-access-7czd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:32:58 crc kubenswrapper[4841]: I1203 17:32:58.949595 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4f7cc5b5-8153-47ba-9f43-5e188c86d8c0" (UID: "4f7cc5b5-8153-47ba-9f43-5e188c86d8c0"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:32:58 crc kubenswrapper[4841]: I1203 17:32:58.955699 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-inventory" (OuterVolumeSpecName: "inventory") pod "4f7cc5b5-8153-47ba-9f43-5e188c86d8c0" (UID: "4f7cc5b5-8153-47ba-9f43-5e188c86d8c0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.020265 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7czd8\" (UniqueName: \"kubernetes.io/projected/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-kube-api-access-7czd8\") on node \"crc\" DevicePath \"\"" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.020435 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.020492 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4f7cc5b5-8153-47ba-9f43-5e188c86d8c0-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.501932 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" event={"ID":"4f7cc5b5-8153-47ba-9f43-5e188c86d8c0","Type":"ContainerDied","Data":"4417e4b7a89b79b0af1b1b79b01caef2dd9b89e278473bea03030d0053d8e758"} Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.501995 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4417e4b7a89b79b0af1b1b79b01caef2dd9b89e278473bea03030d0053d8e758" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.502043 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rf2mz" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.648766 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp"] Dec 03 17:32:59 crc kubenswrapper[4841]: E1203 17:32:59.649151 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f7cc5b5-8153-47ba-9f43-5e188c86d8c0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.649168 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7cc5b5-8153-47ba-9f43-5e188c86d8c0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.649345 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f7cc5b5-8153-47ba-9f43-5e188c86d8c0" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.649971 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.651881 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.652037 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.652536 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.653094 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.657930 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp"] Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.809656 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqgf\" (UniqueName: \"kubernetes.io/projected/c815359c-145d-48c6-936f-98c8f4cf8fff-kube-api-access-5sqgf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.809699 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.809768 4841 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.912083 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqgf\" (UniqueName: \"kubernetes.io/projected/c815359c-145d-48c6-936f-98c8f4cf8fff-kube-api-access-5sqgf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.912139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.912227 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.923962 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp\" (UID: 
\"c815359c-145d-48c6-936f-98c8f4cf8fff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:32:59 crc kubenswrapper[4841]: I1203 17:32:59.924802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:33:00 crc kubenswrapper[4841]: I1203 17:33:00.025982 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqgf\" (UniqueName: \"kubernetes.io/projected/c815359c-145d-48c6-936f-98c8f4cf8fff-kube-api-access-5sqgf\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:33:00 crc kubenswrapper[4841]: I1203 17:33:00.263742 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:33:00 crc kubenswrapper[4841]: I1203 17:33:00.658119 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp"] Dec 03 17:33:01 crc kubenswrapper[4841]: I1203 17:33:01.527044 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" event={"ID":"c815359c-145d-48c6-936f-98c8f4cf8fff","Type":"ContainerStarted","Data":"9968f8b53945fb31e20222b25a1b19ebc8da0e776f4194944462be238340d3f8"} Dec 03 17:33:02 crc kubenswrapper[4841]: I1203 17:33:02.541548 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" event={"ID":"c815359c-145d-48c6-936f-98c8f4cf8fff","Type":"ContainerStarted","Data":"2e4975fbceed960d1f035a5780a0f8a610372111189a59720866c76175206586"} Dec 03 17:33:11 crc kubenswrapper[4841]: I1203 17:33:11.633819 4841 generic.go:334] "Generic (PLEG): container finished" podID="c815359c-145d-48c6-936f-98c8f4cf8fff" containerID="2e4975fbceed960d1f035a5780a0f8a610372111189a59720866c76175206586" exitCode=0 Dec 03 17:33:11 crc kubenswrapper[4841]: I1203 17:33:11.633968 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" event={"ID":"c815359c-145d-48c6-936f-98c8f4cf8fff","Type":"ContainerDied","Data":"2e4975fbceed960d1f035a5780a0f8a610372111189a59720866c76175206586"} Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.084845 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.148638 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sqgf\" (UniqueName: \"kubernetes.io/projected/c815359c-145d-48c6-936f-98c8f4cf8fff-kube-api-access-5sqgf\") pod \"c815359c-145d-48c6-936f-98c8f4cf8fff\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.149113 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-ssh-key\") pod \"c815359c-145d-48c6-936f-98c8f4cf8fff\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.149145 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-inventory\") pod \"c815359c-145d-48c6-936f-98c8f4cf8fff\" (UID: \"c815359c-145d-48c6-936f-98c8f4cf8fff\") " Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.164153 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c815359c-145d-48c6-936f-98c8f4cf8fff-kube-api-access-5sqgf" (OuterVolumeSpecName: "kube-api-access-5sqgf") pod "c815359c-145d-48c6-936f-98c8f4cf8fff" (UID: "c815359c-145d-48c6-936f-98c8f4cf8fff"). InnerVolumeSpecName "kube-api-access-5sqgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.184058 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-inventory" (OuterVolumeSpecName: "inventory") pod "c815359c-145d-48c6-936f-98c8f4cf8fff" (UID: "c815359c-145d-48c6-936f-98c8f4cf8fff"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.192279 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c815359c-145d-48c6-936f-98c8f4cf8fff" (UID: "c815359c-145d-48c6-936f-98c8f4cf8fff"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.251564 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.251600 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c815359c-145d-48c6-936f-98c8f4cf8fff-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.251613 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sqgf\" (UniqueName: \"kubernetes.io/projected/c815359c-145d-48c6-936f-98c8f4cf8fff-kube-api-access-5sqgf\") on node \"crc\" DevicePath \"\"" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.658825 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" event={"ID":"c815359c-145d-48c6-936f-98c8f4cf8fff","Type":"ContainerDied","Data":"9968f8b53945fb31e20222b25a1b19ebc8da0e776f4194944462be238340d3f8"} Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.659303 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9968f8b53945fb31e20222b25a1b19ebc8da0e776f4194944462be238340d3f8" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.658902 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.797353 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7"] Dec 03 17:33:13 crc kubenswrapper[4841]: E1203 17:33:13.798108 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c815359c-145d-48c6-936f-98c8f4cf8fff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.798144 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c815359c-145d-48c6-936f-98c8f4cf8fff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.798474 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c815359c-145d-48c6-936f-98c8f4cf8fff" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.799609 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.805431 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.805631 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.805969 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.806345 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.806352 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.806438 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.806715 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.807143 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.815774 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7"] Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.863187 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.863259 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.863372 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.863408 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.863440 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.863516 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.864245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.864345 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.864430 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncm8n\" (UniqueName: 
\"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-kube-api-access-ncm8n\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.864519 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.864614 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.864876 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.865110 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.865167 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.966192 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.966255 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.966284 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncm8n\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-kube-api-access-ncm8n\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.966308 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.966344 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.967306 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.967393 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") 
" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.967438 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.967503 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.967551 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.967605 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 
03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.967637 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.967662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.967712 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.971976 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.972652 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.972680 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.973173 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.974259 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.974272 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.974511 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.974786 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.975751 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.976146 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.977765 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.979969 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.980335 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:13 crc kubenswrapper[4841]: I1203 17:33:13.985615 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncm8n\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-kube-api-access-ncm8n\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:14 crc 
kubenswrapper[4841]: I1203 17:33:14.144882 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:33:14 crc kubenswrapper[4841]: I1203 17:33:14.689811 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7"] Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.025416 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8dv57"] Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.032102 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.039185 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8dv57"] Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.200198 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-utilities\") pod \"community-operators-8dv57\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.203664 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqzq4\" (UniqueName: \"kubernetes.io/projected/01cebd78-03c2-441d-8ed8-ee7e999636ca-kube-api-access-lqzq4\") pod \"community-operators-8dv57\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.203796 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-catalog-content\") pod \"community-operators-8dv57\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.306406 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-utilities\") pod \"community-operators-8dv57\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.306628 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqzq4\" (UniqueName: \"kubernetes.io/projected/01cebd78-03c2-441d-8ed8-ee7e999636ca-kube-api-access-lqzq4\") pod \"community-operators-8dv57\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.307056 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-utilities\") pod \"community-operators-8dv57\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.307452 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-catalog-content\") pod \"community-operators-8dv57\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.308103 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-catalog-content\") pod \"community-operators-8dv57\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.326628 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqzq4\" (UniqueName: \"kubernetes.io/projected/01cebd78-03c2-441d-8ed8-ee7e999636ca-kube-api-access-lqzq4\") pod \"community-operators-8dv57\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.413448 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.678657 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" event={"ID":"1e098ac8-ac99-4b82-8723-7171dbb84329","Type":"ContainerStarted","Data":"3c21d515753d2d11763e231f8c76b383115aa80ae0fdfcd588eb65313fbf0dbe"} Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.678983 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" event={"ID":"1e098ac8-ac99-4b82-8723-7171dbb84329","Type":"ContainerStarted","Data":"e4822cee3a93b3c9f7138e5214fe82da7484875f01ad106b1e492026734c57e7"} Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.710102 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" podStartSLOduration=2.257456009 podStartE2EDuration="2.710086715s" podCreationTimestamp="2025-12-03 17:33:13 +0000 UTC" firstStartedPulling="2025-12-03 17:33:14.692490262 +0000 UTC m=+1989.080010999" lastFinishedPulling="2025-12-03 17:33:15.145120978 +0000 UTC m=+1989.532641705" 
observedRunningTime="2025-12-03 17:33:15.701207198 +0000 UTC m=+1990.088727935" watchObservedRunningTime="2025-12-03 17:33:15.710086715 +0000 UTC m=+1990.097607442" Dec 03 17:33:15 crc kubenswrapper[4841]: I1203 17:33:15.759740 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8dv57"] Dec 03 17:33:15 crc kubenswrapper[4841]: W1203 17:33:15.763337 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01cebd78_03c2_441d_8ed8_ee7e999636ca.slice/crio-d9f780d8fe4eb76da84ff8c7987f85896140bc1e494a3cb49dde79923d6a8e2b WatchSource:0}: Error finding container d9f780d8fe4eb76da84ff8c7987f85896140bc1e494a3cb49dde79923d6a8e2b: Status 404 returned error can't find the container with id d9f780d8fe4eb76da84ff8c7987f85896140bc1e494a3cb49dde79923d6a8e2b Dec 03 17:33:16 crc kubenswrapper[4841]: I1203 17:33:16.690230 4841 generic.go:334] "Generic (PLEG): container finished" podID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerID="22208e549952352b5fc7c6def2e8aed0a0be8ed37c1e08d29c05c896a38c15f1" exitCode=0 Dec 03 17:33:16 crc kubenswrapper[4841]: I1203 17:33:16.690297 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dv57" event={"ID":"01cebd78-03c2-441d-8ed8-ee7e999636ca","Type":"ContainerDied","Data":"22208e549952352b5fc7c6def2e8aed0a0be8ed37c1e08d29c05c896a38c15f1"} Dec 03 17:33:16 crc kubenswrapper[4841]: I1203 17:33:16.690709 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dv57" event={"ID":"01cebd78-03c2-441d-8ed8-ee7e999636ca","Type":"ContainerStarted","Data":"d9f780d8fe4eb76da84ff8c7987f85896140bc1e494a3cb49dde79923d6a8e2b"} Dec 03 17:33:17 crc kubenswrapper[4841]: I1203 17:33:17.705346 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dv57" 
event={"ID":"01cebd78-03c2-441d-8ed8-ee7e999636ca","Type":"ContainerStarted","Data":"b61e92f1b15872a03290921111002240a7437df779e43646ba94ed4a85f6d1ce"} Dec 03 17:33:18 crc kubenswrapper[4841]: I1203 17:33:18.719931 4841 generic.go:334] "Generic (PLEG): container finished" podID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerID="b61e92f1b15872a03290921111002240a7437df779e43646ba94ed4a85f6d1ce" exitCode=0 Dec 03 17:33:18 crc kubenswrapper[4841]: I1203 17:33:18.720073 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dv57" event={"ID":"01cebd78-03c2-441d-8ed8-ee7e999636ca","Type":"ContainerDied","Data":"b61e92f1b15872a03290921111002240a7437df779e43646ba94ed4a85f6d1ce"} Dec 03 17:33:19 crc kubenswrapper[4841]: I1203 17:33:19.729714 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dv57" event={"ID":"01cebd78-03c2-441d-8ed8-ee7e999636ca","Type":"ContainerStarted","Data":"20473583fb6ea1061d2e4d1b4ef0c4b9ff8f824a940b05c1e1faf79176f02c76"} Dec 03 17:33:19 crc kubenswrapper[4841]: I1203 17:33:19.749925 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8dv57" podStartSLOduration=3.316990588 podStartE2EDuration="5.749856746s" podCreationTimestamp="2025-12-03 17:33:14 +0000 UTC" firstStartedPulling="2025-12-03 17:33:16.692867807 +0000 UTC m=+1991.080388534" lastFinishedPulling="2025-12-03 17:33:19.125733925 +0000 UTC m=+1993.513254692" observedRunningTime="2025-12-03 17:33:19.748290467 +0000 UTC m=+1994.135811234" watchObservedRunningTime="2025-12-03 17:33:19.749856746 +0000 UTC m=+1994.137377473" Dec 03 17:33:25 crc kubenswrapper[4841]: I1203 17:33:25.415584 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:25 crc kubenswrapper[4841]: I1203 17:33:25.416303 4841 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:25 crc kubenswrapper[4841]: I1203 17:33:25.508161 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:25 crc kubenswrapper[4841]: I1203 17:33:25.848935 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:28 crc kubenswrapper[4841]: I1203 17:33:28.412220 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8dv57"] Dec 03 17:33:28 crc kubenswrapper[4841]: I1203 17:33:28.413138 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8dv57" podUID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerName="registry-server" containerID="cri-o://20473583fb6ea1061d2e4d1b4ef0c4b9ff8f824a940b05c1e1faf79176f02c76" gracePeriod=2 Dec 03 17:33:28 crc kubenswrapper[4841]: I1203 17:33:28.833562 4841 generic.go:334] "Generic (PLEG): container finished" podID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerID="20473583fb6ea1061d2e4d1b4ef0c4b9ff8f824a940b05c1e1faf79176f02c76" exitCode=0 Dec 03 17:33:28 crc kubenswrapper[4841]: I1203 17:33:28.833641 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dv57" event={"ID":"01cebd78-03c2-441d-8ed8-ee7e999636ca","Type":"ContainerDied","Data":"20473583fb6ea1061d2e4d1b4ef0c4b9ff8f824a940b05c1e1faf79176f02c76"} Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.431138 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.615635 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqzq4\" (UniqueName: \"kubernetes.io/projected/01cebd78-03c2-441d-8ed8-ee7e999636ca-kube-api-access-lqzq4\") pod \"01cebd78-03c2-441d-8ed8-ee7e999636ca\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.615838 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-catalog-content\") pod \"01cebd78-03c2-441d-8ed8-ee7e999636ca\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.615933 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-utilities\") pod \"01cebd78-03c2-441d-8ed8-ee7e999636ca\" (UID: \"01cebd78-03c2-441d-8ed8-ee7e999636ca\") " Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.616724 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-utilities" (OuterVolumeSpecName: "utilities") pod "01cebd78-03c2-441d-8ed8-ee7e999636ca" (UID: "01cebd78-03c2-441d-8ed8-ee7e999636ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.625045 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01cebd78-03c2-441d-8ed8-ee7e999636ca-kube-api-access-lqzq4" (OuterVolumeSpecName: "kube-api-access-lqzq4") pod "01cebd78-03c2-441d-8ed8-ee7e999636ca" (UID: "01cebd78-03c2-441d-8ed8-ee7e999636ca"). InnerVolumeSpecName "kube-api-access-lqzq4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.664155 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01cebd78-03c2-441d-8ed8-ee7e999636ca" (UID: "01cebd78-03c2-441d-8ed8-ee7e999636ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.718991 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.719044 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01cebd78-03c2-441d-8ed8-ee7e999636ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.719064 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqzq4\" (UniqueName: \"kubernetes.io/projected/01cebd78-03c2-441d-8ed8-ee7e999636ca-kube-api-access-lqzq4\") on node \"crc\" DevicePath \"\"" Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.852876 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8dv57" event={"ID":"01cebd78-03c2-441d-8ed8-ee7e999636ca","Type":"ContainerDied","Data":"d9f780d8fe4eb76da84ff8c7987f85896140bc1e494a3cb49dde79923d6a8e2b"} Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.853269 4841 scope.go:117] "RemoveContainer" containerID="20473583fb6ea1061d2e4d1b4ef0c4b9ff8f824a940b05c1e1faf79176f02c76" Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.852960 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8dv57" Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.922488 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8dv57"] Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.940539 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8dv57"] Dec 03 17:33:29 crc kubenswrapper[4841]: I1203 17:33:29.955081 4841 scope.go:117] "RemoveContainer" containerID="b61e92f1b15872a03290921111002240a7437df779e43646ba94ed4a85f6d1ce" Dec 03 17:33:30 crc kubenswrapper[4841]: I1203 17:33:30.056090 4841 scope.go:117] "RemoveContainer" containerID="22208e549952352b5fc7c6def2e8aed0a0be8ed37c1e08d29c05c896a38c15f1" Dec 03 17:33:30 crc kubenswrapper[4841]: I1203 17:33:30.253828 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01cebd78-03c2-441d-8ed8-ee7e999636ca" path="/var/lib/kubelet/pods/01cebd78-03c2-441d-8ed8-ee7e999636ca/volumes" Dec 03 17:33:39 crc kubenswrapper[4841]: I1203 17:33:39.316900 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:33:39 crc kubenswrapper[4841]: I1203 17:33:39.317609 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:33:59 crc kubenswrapper[4841]: I1203 17:33:59.189178 4841 generic.go:334] "Generic (PLEG): container finished" podID="1e098ac8-ac99-4b82-8723-7171dbb84329" 
containerID="3c21d515753d2d11763e231f8c76b383115aa80ae0fdfcd588eb65313fbf0dbe" exitCode=0 Dec 03 17:33:59 crc kubenswrapper[4841]: I1203 17:33:59.189305 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" event={"ID":"1e098ac8-ac99-4b82-8723-7171dbb84329","Type":"ContainerDied","Data":"3c21d515753d2d11763e231f8c76b383115aa80ae0fdfcd588eb65313fbf0dbe"} Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.748871 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.915009 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-telemetry-combined-ca-bundle\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.915534 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-inventory\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.915652 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-repo-setup-combined-ca-bundle\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.915814 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncm8n\" (UniqueName: 
\"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-kube-api-access-ncm8n\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916007 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-libvirt-combined-ca-bundle\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916060 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ovn-combined-ca-bundle\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916192 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-nova-combined-ca-bundle\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916281 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-ovn-default-certs-0\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916359 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916409 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ssh-key\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916459 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-bootstrap-combined-ca-bundle\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916523 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916589 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-neutron-metadata-combined-ca-bundle\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.916683 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"1e098ac8-ac99-4b82-8723-7171dbb84329\" (UID: \"1e098ac8-ac99-4b82-8723-7171dbb84329\") " Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.923565 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-kube-api-access-ncm8n" (OuterVolumeSpecName: "kube-api-access-ncm8n") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "kube-api-access-ncm8n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.923667 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.924666 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.924725 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.925595 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.927036 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.927829 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.927973 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.929094 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.930051 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.931621 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.932084 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.972380 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:34:00 crc kubenswrapper[4841]: I1203 17:34:00.976149 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-inventory" (OuterVolumeSpecName: "inventory") pod "1e098ac8-ac99-4b82-8723-7171dbb84329" (UID: "1e098ac8-ac99-4b82-8723-7171dbb84329"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020066 4841 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020102 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020149 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020163 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020178 4841 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020193 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020206 4841 reconciler_common.go:293] "Volume detached for volume 
\"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020218 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020229 4841 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020242 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020252 4841 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020263 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncm8n\" (UniqueName: \"kubernetes.io/projected/1e098ac8-ac99-4b82-8723-7171dbb84329-kube-api-access-ncm8n\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020275 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.020285 4841 
reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e098ac8-ac99-4b82-8723-7171dbb84329-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.214342 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" event={"ID":"1e098ac8-ac99-4b82-8723-7171dbb84329","Type":"ContainerDied","Data":"e4822cee3a93b3c9f7138e5214fe82da7484875f01ad106b1e492026734c57e7"} Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.214385 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4822cee3a93b3c9f7138e5214fe82da7484875f01ad106b1e492026734c57e7" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.214396 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.346111 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc"] Dec 03 17:34:01 crc kubenswrapper[4841]: E1203 17:34:01.346540 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e098ac8-ac99-4b82-8723-7171dbb84329" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.346566 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e098ac8-ac99-4b82-8723-7171dbb84329" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 17:34:01 crc kubenswrapper[4841]: E1203 17:34:01.346587 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerName="extract-content" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.346596 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerName="extract-content" Dec 03 17:34:01 crc kubenswrapper[4841]: E1203 17:34:01.346625 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerName="extract-utilities" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.346634 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerName="extract-utilities" Dec 03 17:34:01 crc kubenswrapper[4841]: E1203 17:34:01.346649 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerName="registry-server" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.346658 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerName="registry-server" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.346884 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="01cebd78-03c2-441d-8ed8-ee7e999636ca" containerName="registry-server" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.347929 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e098ac8-ac99-4b82-8723-7171dbb84329" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.348589 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.351690 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.352237 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.352527 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.352808 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.353042 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.361333 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc"] Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.531202 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.531263 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.531498 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.531688 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.531815 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbx6q\" (UniqueName: \"kubernetes.io/projected/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-kube-api-access-nbx6q\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.633646 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.633730 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" 
(UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.633853 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.634018 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.634083 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbx6q\" (UniqueName: \"kubernetes.io/projected/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-kube-api-access-nbx6q\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.636238 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc 
kubenswrapper[4841]: I1203 17:34:01.640684 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.641830 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.642280 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.665323 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbx6q\" (UniqueName: \"kubernetes.io/projected/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-kube-api-access-nbx6q\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-js7nc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:01 crc kubenswrapper[4841]: I1203 17:34:01.676234 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:34:02 crc kubenswrapper[4841]: W1203 17:34:02.262681 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14e63bf1_717a_40b6_8c5d_e46bf40c68dc.slice/crio-73d71f1930f0e3aea58d1d322d427e89f84167895769bb49ee046abf2a81bc00 WatchSource:0}: Error finding container 73d71f1930f0e3aea58d1d322d427e89f84167895769bb49ee046abf2a81bc00: Status 404 returned error can't find the container with id 73d71f1930f0e3aea58d1d322d427e89f84167895769bb49ee046abf2a81bc00 Dec 03 17:34:02 crc kubenswrapper[4841]: I1203 17:34:02.276848 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc"] Dec 03 17:34:03 crc kubenswrapper[4841]: I1203 17:34:03.236453 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" event={"ID":"14e63bf1-717a-40b6-8c5d-e46bf40c68dc","Type":"ContainerStarted","Data":"97fa70bd49cf9b7d40e82fbb0c75a04d1ed7a9f1f768ee1fe693feefeebc3973"} Dec 03 17:34:03 crc kubenswrapper[4841]: I1203 17:34:03.236803 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" event={"ID":"14e63bf1-717a-40b6-8c5d-e46bf40c68dc","Type":"ContainerStarted","Data":"73d71f1930f0e3aea58d1d322d427e89f84167895769bb49ee046abf2a81bc00"} Dec 03 17:34:03 crc kubenswrapper[4841]: I1203 17:34:03.262183 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" podStartSLOduration=1.785353986 podStartE2EDuration="2.262160681s" podCreationTimestamp="2025-12-03 17:34:01 +0000 UTC" firstStartedPulling="2025-12-03 17:34:02.267477338 +0000 UTC m=+2036.654998105" lastFinishedPulling="2025-12-03 17:34:02.744284053 +0000 UTC m=+2037.131804800" observedRunningTime="2025-12-03 
17:34:03.255037387 +0000 UTC m=+2037.642558114" watchObservedRunningTime="2025-12-03 17:34:03.262160681 +0000 UTC m=+2037.649681408" Dec 03 17:34:09 crc kubenswrapper[4841]: I1203 17:34:09.316862 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:34:09 crc kubenswrapper[4841]: I1203 17:34:09.317677 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:34:24 crc kubenswrapper[4841]: I1203 17:34:24.707678 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8sjn5"] Dec 03 17:34:24 crc kubenswrapper[4841]: I1203 17:34:24.710551 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:24 crc kubenswrapper[4841]: I1203 17:34:24.718707 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8sjn5"] Dec 03 17:34:24 crc kubenswrapper[4841]: I1203 17:34:24.899702 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-utilities\") pod \"redhat-operators-8sjn5\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:24 crc kubenswrapper[4841]: I1203 17:34:24.899976 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-catalog-content\") pod \"redhat-operators-8sjn5\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:24 crc kubenswrapper[4841]: I1203 17:34:24.900040 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qqmt\" (UniqueName: \"kubernetes.io/projected/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-kube-api-access-9qqmt\") pod \"redhat-operators-8sjn5\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:25 crc kubenswrapper[4841]: I1203 17:34:25.001460 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-catalog-content\") pod \"redhat-operators-8sjn5\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:25 crc kubenswrapper[4841]: I1203 17:34:25.001517 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-9qqmt\" (UniqueName: \"kubernetes.io/projected/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-kube-api-access-9qqmt\") pod \"redhat-operators-8sjn5\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:25 crc kubenswrapper[4841]: I1203 17:34:25.001747 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-utilities\") pod \"redhat-operators-8sjn5\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:25 crc kubenswrapper[4841]: I1203 17:34:25.001962 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-catalog-content\") pod \"redhat-operators-8sjn5\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:25 crc kubenswrapper[4841]: I1203 17:34:25.002224 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-utilities\") pod \"redhat-operators-8sjn5\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:25 crc kubenswrapper[4841]: I1203 17:34:25.022782 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qqmt\" (UniqueName: \"kubernetes.io/projected/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-kube-api-access-9qqmt\") pod \"redhat-operators-8sjn5\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:25 crc kubenswrapper[4841]: I1203 17:34:25.040422 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:25 crc kubenswrapper[4841]: I1203 17:34:25.502709 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8sjn5"] Dec 03 17:34:26 crc kubenswrapper[4841]: I1203 17:34:26.498617 4841 generic.go:334] "Generic (PLEG): container finished" podID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerID="7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129" exitCode=0 Dec 03 17:34:26 crc kubenswrapper[4841]: I1203 17:34:26.498738 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sjn5" event={"ID":"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17","Type":"ContainerDied","Data":"7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129"} Dec 03 17:34:26 crc kubenswrapper[4841]: I1203 17:34:26.499138 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sjn5" event={"ID":"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17","Type":"ContainerStarted","Data":"9baddde74d97fab44b418629a6adb3e0e6639bf60fdeb2bc2fba03b4b4dcd3b2"} Dec 03 17:34:28 crc kubenswrapper[4841]: I1203 17:34:28.520690 4841 generic.go:334] "Generic (PLEG): container finished" podID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerID="baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224" exitCode=0 Dec 03 17:34:28 crc kubenswrapper[4841]: I1203 17:34:28.520772 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sjn5" event={"ID":"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17","Type":"ContainerDied","Data":"baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224"} Dec 03 17:34:29 crc kubenswrapper[4841]: I1203 17:34:29.552008 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sjn5" 
event={"ID":"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17","Type":"ContainerStarted","Data":"f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe"} Dec 03 17:34:29 crc kubenswrapper[4841]: I1203 17:34:29.579084 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8sjn5" podStartSLOduration=3.1389404929999998 podStartE2EDuration="5.579063468s" podCreationTimestamp="2025-12-03 17:34:24 +0000 UTC" firstStartedPulling="2025-12-03 17:34:26.501035537 +0000 UTC m=+2060.888556274" lastFinishedPulling="2025-12-03 17:34:28.941158502 +0000 UTC m=+2063.328679249" observedRunningTime="2025-12-03 17:34:29.576607439 +0000 UTC m=+2063.964128166" watchObservedRunningTime="2025-12-03 17:34:29.579063468 +0000 UTC m=+2063.966584205" Dec 03 17:34:35 crc kubenswrapper[4841]: I1203 17:34:35.041478 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:35 crc kubenswrapper[4841]: I1203 17:34:35.042107 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:35 crc kubenswrapper[4841]: I1203 17:34:35.107538 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:35 crc kubenswrapper[4841]: I1203 17:34:35.689153 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:35 crc kubenswrapper[4841]: I1203 17:34:35.748736 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8sjn5"] Dec 03 17:34:37 crc kubenswrapper[4841]: I1203 17:34:37.651754 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8sjn5" podUID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerName="registry-server" 
containerID="cri-o://f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe" gracePeriod=2 Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.114977 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.184836 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-catalog-content\") pod \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.184995 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-utilities\") pod \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.185087 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qqmt\" (UniqueName: \"kubernetes.io/projected/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-kube-api-access-9qqmt\") pod \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\" (UID: \"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17\") " Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.193031 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-utilities" (OuterVolumeSpecName: "utilities") pod "0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" (UID: "0305e9d7-7616-4fdf-a88c-6bd56b5e6f17"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.195109 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-kube-api-access-9qqmt" (OuterVolumeSpecName: "kube-api-access-9qqmt") pod "0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" (UID: "0305e9d7-7616-4fdf-a88c-6bd56b5e6f17"). InnerVolumeSpecName "kube-api-access-9qqmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.204451 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.204511 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qqmt\" (UniqueName: \"kubernetes.io/projected/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-kube-api-access-9qqmt\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.664483 4841 generic.go:334] "Generic (PLEG): container finished" podID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerID="f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe" exitCode=0 Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.664536 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sjn5" event={"ID":"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17","Type":"ContainerDied","Data":"f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe"} Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.664562 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8sjn5" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.666087 4841 scope.go:117] "RemoveContainer" containerID="f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.666064 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8sjn5" event={"ID":"0305e9d7-7616-4fdf-a88c-6bd56b5e6f17","Type":"ContainerDied","Data":"9baddde74d97fab44b418629a6adb3e0e6639bf60fdeb2bc2fba03b4b4dcd3b2"} Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.696211 4841 scope.go:117] "RemoveContainer" containerID="baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.731860 4841 scope.go:117] "RemoveContainer" containerID="7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.795464 4841 scope.go:117] "RemoveContainer" containerID="f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe" Dec 03 17:34:38 crc kubenswrapper[4841]: E1203 17:34:38.796059 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe\": container with ID starting with f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe not found: ID does not exist" containerID="f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.796101 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe"} err="failed to get container status \"f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe\": rpc error: code = NotFound desc = could not find container 
\"f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe\": container with ID starting with f2df2cc9bd988280d911a6700a799f86f279e2f7b792b3be4e7e4dfdbda51bbe not found: ID does not exist" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.796127 4841 scope.go:117] "RemoveContainer" containerID="baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224" Dec 03 17:34:38 crc kubenswrapper[4841]: E1203 17:34:38.796736 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224\": container with ID starting with baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224 not found: ID does not exist" containerID="baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.796960 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224"} err="failed to get container status \"baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224\": rpc error: code = NotFound desc = could not find container \"baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224\": container with ID starting with baa663e4c1ca232538215258b689de7dfb51d299a0d09bb27105db26c74c3224 not found: ID does not exist" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.797119 4841 scope.go:117] "RemoveContainer" containerID="7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129" Dec 03 17:34:38 crc kubenswrapper[4841]: E1203 17:34:38.797674 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129\": container with ID starting with 7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129 not found: ID does not exist" 
containerID="7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129" Dec 03 17:34:38 crc kubenswrapper[4841]: I1203 17:34:38.797701 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129"} err="failed to get container status \"7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129\": rpc error: code = NotFound desc = could not find container \"7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129\": container with ID starting with 7e36a7c80736caed54bc702c261be9a7bff2d65ff895137902c0936ac2b09129 not found: ID does not exist" Dec 03 17:34:39 crc kubenswrapper[4841]: I1203 17:34:39.316300 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:34:39 crc kubenswrapper[4841]: I1203 17:34:39.316400 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:34:39 crc kubenswrapper[4841]: I1203 17:34:39.316502 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:34:39 crc kubenswrapper[4841]: I1203 17:34:39.317637 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bed7cf0f01ebf759561d9f75934aee183d75629f6f360f64928e5bdd416653d8"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:34:39 crc kubenswrapper[4841]: I1203 17:34:39.317749 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://bed7cf0f01ebf759561d9f75934aee183d75629f6f360f64928e5bdd416653d8" gracePeriod=600 Dec 03 17:34:40 crc kubenswrapper[4841]: I1203 17:34:40.318207 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" (UID: "0305e9d7-7616-4fdf-a88c-6bd56b5e6f17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:34:40 crc kubenswrapper[4841]: I1203 17:34:40.345298 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:34:40 crc kubenswrapper[4841]: I1203 17:34:40.499195 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8sjn5"] Dec 03 17:34:40 crc kubenswrapper[4841]: I1203 17:34:40.508776 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8sjn5"] Dec 03 17:34:40 crc kubenswrapper[4841]: I1203 17:34:40.687589 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="bed7cf0f01ebf759561d9f75934aee183d75629f6f360f64928e5bdd416653d8" exitCode=0 Dec 03 17:34:40 crc kubenswrapper[4841]: I1203 17:34:40.687667 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" 
event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"bed7cf0f01ebf759561d9f75934aee183d75629f6f360f64928e5bdd416653d8"} Dec 03 17:34:40 crc kubenswrapper[4841]: I1203 17:34:40.688002 4841 scope.go:117] "RemoveContainer" containerID="1a7fdbbf2731ca3b40cbd9595336a664433bd5c744526dd5c32b09ef0705c686" Dec 03 17:34:41 crc kubenswrapper[4841]: I1203 17:34:41.733529 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487"} Dec 03 17:34:42 crc kubenswrapper[4841]: I1203 17:34:42.260479 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" path="/var/lib/kubelet/pods/0305e9d7-7616-4fdf-a88c-6bd56b5e6f17/volumes" Dec 03 17:35:12 crc kubenswrapper[4841]: I1203 17:35:12.053529 4841 generic.go:334] "Generic (PLEG): container finished" podID="14e63bf1-717a-40b6-8c5d-e46bf40c68dc" containerID="97fa70bd49cf9b7d40e82fbb0c75a04d1ed7a9f1f768ee1fe693feefeebc3973" exitCode=0 Dec 03 17:35:12 crc kubenswrapper[4841]: I1203 17:35:12.053614 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" event={"ID":"14e63bf1-717a-40b6-8c5d-e46bf40c68dc","Type":"ContainerDied","Data":"97fa70bd49cf9b7d40e82fbb0c75a04d1ed7a9f1f768ee1fe693feefeebc3973"} Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.570696 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.635971 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovncontroller-config-0\") pod \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.636026 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ssh-key\") pod \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.636107 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-inventory\") pod \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.636181 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbx6q\" (UniqueName: \"kubernetes.io/projected/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-kube-api-access-nbx6q\") pod \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.636204 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovn-combined-ca-bundle\") pod \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\" (UID: \"14e63bf1-717a-40b6-8c5d-e46bf40c68dc\") " Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.645652 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/projected/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-kube-api-access-nbx6q" (OuterVolumeSpecName: "kube-api-access-nbx6q") pod "14e63bf1-717a-40b6-8c5d-e46bf40c68dc" (UID: "14e63bf1-717a-40b6-8c5d-e46bf40c68dc"). InnerVolumeSpecName "kube-api-access-nbx6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.646070 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "14e63bf1-717a-40b6-8c5d-e46bf40c68dc" (UID: "14e63bf1-717a-40b6-8c5d-e46bf40c68dc"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.669109 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "14e63bf1-717a-40b6-8c5d-e46bf40c68dc" (UID: "14e63bf1-717a-40b6-8c5d-e46bf40c68dc"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.673202 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "14e63bf1-717a-40b6-8c5d-e46bf40c68dc" (UID: "14e63bf1-717a-40b6-8c5d-e46bf40c68dc"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.674229 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-inventory" (OuterVolumeSpecName: "inventory") pod "14e63bf1-717a-40b6-8c5d-e46bf40c68dc" (UID: "14e63bf1-717a-40b6-8c5d-e46bf40c68dc"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.739416 4841 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.739461 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.739472 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.739482 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbx6q\" (UniqueName: \"kubernetes.io/projected/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-kube-api-access-nbx6q\") on node \"crc\" DevicePath \"\"" Dec 03 17:35:13 crc kubenswrapper[4841]: I1203 17:35:13.739494 4841 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/14e63bf1-717a-40b6-8c5d-e46bf40c68dc-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.088075 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" event={"ID":"14e63bf1-717a-40b6-8c5d-e46bf40c68dc","Type":"ContainerDied","Data":"73d71f1930f0e3aea58d1d322d427e89f84167895769bb49ee046abf2a81bc00"} Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.088440 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d71f1930f0e3aea58d1d322d427e89f84167895769bb49ee046abf2a81bc00" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.088523 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-js7nc" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.187380 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt"] Dec 03 17:35:14 crc kubenswrapper[4841]: E1203 17:35:14.188057 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerName="registry-server" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.188088 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerName="registry-server" Dec 03 17:35:14 crc kubenswrapper[4841]: E1203 17:35:14.188119 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14e63bf1-717a-40b6-8c5d-e46bf40c68dc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.188135 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="14e63bf1-717a-40b6-8c5d-e46bf40c68dc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 17:35:14 crc kubenswrapper[4841]: E1203 17:35:14.188178 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerName="extract-content" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.188193 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerName="extract-content" Dec 03 17:35:14 crc kubenswrapper[4841]: E1203 17:35:14.188243 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerName="extract-utilities" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.188257 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerName="extract-utilities" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.188584 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="14e63bf1-717a-40b6-8c5d-e46bf40c68dc" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.188623 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0305e9d7-7616-4fdf-a88c-6bd56b5e6f17" containerName="registry-server" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.189862 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.192211 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.192389 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.192985 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.193073 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.193224 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.194246 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.213698 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt"] Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.249584 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzg92\" (UniqueName: \"kubernetes.io/projected/390fe67f-4d0d-459c-9f27-d6cc843c2d55-kube-api-access-fzg92\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.249651 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.249720 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.249760 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.249798 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.249949 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.352126 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.352222 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.352918 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.352968 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.353214 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.353735 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzg92\" (UniqueName: \"kubernetes.io/projected/390fe67f-4d0d-459c-9f27-d6cc843c2d55-kube-api-access-fzg92\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.357026 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.357659 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: 
\"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.359760 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.361645 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.362685 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 17:35:14.374068 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzg92\" (UniqueName: \"kubernetes.io/projected/390fe67f-4d0d-459c-9f27-d6cc843c2d55-kube-api-access-fzg92\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:14 crc kubenswrapper[4841]: I1203 
17:35:14.514316 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:35:15 crc kubenswrapper[4841]: I1203 17:35:15.115251 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt"] Dec 03 17:35:16 crc kubenswrapper[4841]: I1203 17:35:16.106408 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" event={"ID":"390fe67f-4d0d-459c-9f27-d6cc843c2d55","Type":"ContainerStarted","Data":"7851d4cd04e5f323eb74b6357489173b06cdc6c0a7a44e8e434c8433604cc9f4"} Dec 03 17:35:16 crc kubenswrapper[4841]: I1203 17:35:16.106716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" event={"ID":"390fe67f-4d0d-459c-9f27-d6cc843c2d55","Type":"ContainerStarted","Data":"d60eb89d4bfbe2a58954bd99fec9eb9587fd1ef20fe86b0cf478d156c5d0bcd4"} Dec 03 17:35:16 crc kubenswrapper[4841]: I1203 17:35:16.136483 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" podStartSLOduration=1.747469479 podStartE2EDuration="2.136463491s" podCreationTimestamp="2025-12-03 17:35:14 +0000 UTC" firstStartedPulling="2025-12-03 17:35:15.111409481 +0000 UTC m=+2109.498930218" lastFinishedPulling="2025-12-03 17:35:15.500403493 +0000 UTC m=+2109.887924230" observedRunningTime="2025-12-03 17:35:16.129228623 +0000 UTC m=+2110.516749380" watchObservedRunningTime="2025-12-03 17:35:16.136463491 +0000 UTC m=+2110.523984218" Dec 03 17:36:08 crc kubenswrapper[4841]: I1203 17:36:08.773742 4841 generic.go:334] "Generic (PLEG): container finished" podID="390fe67f-4d0d-459c-9f27-d6cc843c2d55" containerID="7851d4cd04e5f323eb74b6357489173b06cdc6c0a7a44e8e434c8433604cc9f4" exitCode=0 Dec 03 17:36:08 crc 
kubenswrapper[4841]: I1203 17:36:08.773870 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" event={"ID":"390fe67f-4d0d-459c-9f27-d6cc843c2d55","Type":"ContainerDied","Data":"7851d4cd04e5f323eb74b6357489173b06cdc6c0a7a44e8e434c8433604cc9f4"} Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.266858 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.412763 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-ovn-metadata-agent-neutron-config-0\") pod \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.413536 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-metadata-combined-ca-bundle\") pod \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.413617 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-nova-metadata-neutron-config-0\") pod \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.413747 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-inventory\") pod 
\"390fe67f-4d0d-459c-9f27-d6cc843c2d55\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.413950 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-ssh-key\") pod \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.414027 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fzg92\" (UniqueName: \"kubernetes.io/projected/390fe67f-4d0d-459c-9f27-d6cc843c2d55-kube-api-access-fzg92\") pod \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\" (UID: \"390fe67f-4d0d-459c-9f27-d6cc843c2d55\") " Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.419585 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/390fe67f-4d0d-459c-9f27-d6cc843c2d55-kube-api-access-fzg92" (OuterVolumeSpecName: "kube-api-access-fzg92") pod "390fe67f-4d0d-459c-9f27-d6cc843c2d55" (UID: "390fe67f-4d0d-459c-9f27-d6cc843c2d55"). InnerVolumeSpecName "kube-api-access-fzg92". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.423969 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fzg92\" (UniqueName: \"kubernetes.io/projected/390fe67f-4d0d-459c-9f27-d6cc843c2d55-kube-api-access-fzg92\") on node \"crc\" DevicePath \"\"" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.435994 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "390fe67f-4d0d-459c-9f27-d6cc843c2d55" (UID: "390fe67f-4d0d-459c-9f27-d6cc843c2d55"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.489184 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-inventory" (OuterVolumeSpecName: "inventory") pod "390fe67f-4d0d-459c-9f27-d6cc843c2d55" (UID: "390fe67f-4d0d-459c-9f27-d6cc843c2d55"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.490665 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "390fe67f-4d0d-459c-9f27-d6cc843c2d55" (UID: "390fe67f-4d0d-459c-9f27-d6cc843c2d55"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.502019 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "390fe67f-4d0d-459c-9f27-d6cc843c2d55" (UID: "390fe67f-4d0d-459c-9f27-d6cc843c2d55"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.504618 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "390fe67f-4d0d-459c-9f27-d6cc843c2d55" (UID: "390fe67f-4d0d-459c-9f27-d6cc843c2d55"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.525852 4841 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.525876 4841 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.525888 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.525896 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.525921 4841 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/390fe67f-4d0d-459c-9f27-d6cc843c2d55-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.803147 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" event={"ID":"390fe67f-4d0d-459c-9f27-d6cc843c2d55","Type":"ContainerDied","Data":"d60eb89d4bfbe2a58954bd99fec9eb9587fd1ef20fe86b0cf478d156c5d0bcd4"} Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.803196 4841 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="d60eb89d4bfbe2a58954bd99fec9eb9587fd1ef20fe86b0cf478d156c5d0bcd4" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.803268 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.914980 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s"] Dec 03 17:36:10 crc kubenswrapper[4841]: E1203 17:36:10.915565 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="390fe67f-4d0d-459c-9f27-d6cc843c2d55" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.915591 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="390fe67f-4d0d-459c-9f27-d6cc843c2d55" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.915870 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="390fe67f-4d0d-459c-9f27-d6cc843c2d55" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.916779 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.920768 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.921176 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.921306 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.921555 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.921679 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:36:10 crc kubenswrapper[4841]: I1203 17:36:10.926610 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s"] Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.034928 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.035206 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: 
\"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.035272 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c55zh\" (UniqueName: \"kubernetes.io/projected/fbcfe5cb-55b6-4840-ad0c-a916165933d6-kube-api-access-c55zh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.035301 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.035615 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.137383 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.137435 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.137492 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c55zh\" (UniqueName: \"kubernetes.io/projected/fbcfe5cb-55b6-4840-ad0c-a916165933d6-kube-api-access-c55zh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.137520 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.137579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.142616 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.143613 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.143725 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.152619 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.170007 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c55zh\" (UniqueName: \"kubernetes.io/projected/fbcfe5cb-55b6-4840-ad0c-a916165933d6-kube-api-access-c55zh\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.270723 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:36:11 crc kubenswrapper[4841]: I1203 17:36:11.889723 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s"] Dec 03 17:36:12 crc kubenswrapper[4841]: I1203 17:36:12.825085 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" event={"ID":"fbcfe5cb-55b6-4840-ad0c-a916165933d6","Type":"ContainerStarted","Data":"f51d35701746bccda649c5bddbf57214b20389ceba8bd031030f685331f3f345"} Dec 03 17:36:13 crc kubenswrapper[4841]: I1203 17:36:13.841674 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" event={"ID":"fbcfe5cb-55b6-4840-ad0c-a916165933d6","Type":"ContainerStarted","Data":"d532b686dd0b045c14dcd96e744618cdf74c2e51df45f850d936dc9752fa27fb"} Dec 03 17:36:13 crc kubenswrapper[4841]: I1203 17:36:13.878186 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" podStartSLOduration=3.207518766 podStartE2EDuration="3.878156002s" podCreationTimestamp="2025-12-03 17:36:10 +0000 UTC" firstStartedPulling="2025-12-03 17:36:11.903400984 +0000 UTC m=+2166.290921751" lastFinishedPulling="2025-12-03 17:36:12.57403826 +0000 UTC m=+2166.961558987" observedRunningTime="2025-12-03 17:36:13.87196676 +0000 UTC m=+2168.259487537" watchObservedRunningTime="2025-12-03 17:36:13.878156002 +0000 UTC m=+2168.265676769" Dec 03 17:37:09 crc kubenswrapper[4841]: I1203 17:37:09.316988 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:37:09 crc kubenswrapper[4841]: I1203 
17:37:09.317368 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:37:39 crc kubenswrapper[4841]: I1203 17:37:39.316413 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:37:39 crc kubenswrapper[4841]: I1203 17:37:39.317021 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:38:09 crc kubenswrapper[4841]: I1203 17:38:09.316683 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:38:09 crc kubenswrapper[4841]: I1203 17:38:09.317355 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:38:09 crc kubenswrapper[4841]: I1203 17:38:09.317446 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:38:09 crc kubenswrapper[4841]: I1203 17:38:09.318757 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:38:09 crc kubenswrapper[4841]: I1203 17:38:09.318872 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" gracePeriod=600 Dec 03 17:38:09 crc kubenswrapper[4841]: E1203 17:38:09.450736 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:38:10 crc kubenswrapper[4841]: I1203 17:38:10.134977 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" exitCode=0 Dec 03 17:38:10 crc kubenswrapper[4841]: I1203 17:38:10.135046 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487"} Dec 03 17:38:10 crc 
kubenswrapper[4841]: I1203 17:38:10.135466 4841 scope.go:117] "RemoveContainer" containerID="bed7cf0f01ebf759561d9f75934aee183d75629f6f360f64928e5bdd416653d8" Dec 03 17:38:10 crc kubenswrapper[4841]: I1203 17:38:10.136385 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:38:10 crc kubenswrapper[4841]: E1203 17:38:10.136798 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:38:21 crc kubenswrapper[4841]: I1203 17:38:21.239301 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:38:21 crc kubenswrapper[4841]: E1203 17:38:21.240646 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:38:34 crc kubenswrapper[4841]: I1203 17:38:34.240013 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:38:34 crc kubenswrapper[4841]: E1203 17:38:34.241308 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.033012 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-7f9b8"] Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.035796 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.040948 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7f9b8"] Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.206776 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2smb\" (UniqueName: \"kubernetes.io/projected/a80810f2-62ac-4d7e-abe8-889db385f5eb-kube-api-access-t2smb\") pod \"redhat-marketplace-7f9b8\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.206836 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-catalog-content\") pod \"redhat-marketplace-7f9b8\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.206891 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-utilities\") pod \"redhat-marketplace-7f9b8\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 
03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.308348 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2smb\" (UniqueName: \"kubernetes.io/projected/a80810f2-62ac-4d7e-abe8-889db385f5eb-kube-api-access-t2smb\") pod \"redhat-marketplace-7f9b8\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.308390 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-catalog-content\") pod \"redhat-marketplace-7f9b8\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.308426 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-utilities\") pod \"redhat-marketplace-7f9b8\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.309099 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-utilities\") pod \"redhat-marketplace-7f9b8\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.309098 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-catalog-content\") pod \"redhat-marketplace-7f9b8\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 
17:38:37.335183 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2smb\" (UniqueName: \"kubernetes.io/projected/a80810f2-62ac-4d7e-abe8-889db385f5eb-kube-api-access-t2smb\") pod \"redhat-marketplace-7f9b8\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.360150 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:37 crc kubenswrapper[4841]: I1203 17:38:37.874171 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-7f9b8"] Dec 03 17:38:38 crc kubenswrapper[4841]: I1203 17:38:38.468756 4841 generic.go:334] "Generic (PLEG): container finished" podID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerID="24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628" exitCode=0 Dec 03 17:38:38 crc kubenswrapper[4841]: I1203 17:38:38.468838 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7f9b8" event={"ID":"a80810f2-62ac-4d7e-abe8-889db385f5eb","Type":"ContainerDied","Data":"24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628"} Dec 03 17:38:38 crc kubenswrapper[4841]: I1203 17:38:38.468884 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7f9b8" event={"ID":"a80810f2-62ac-4d7e-abe8-889db385f5eb","Type":"ContainerStarted","Data":"53c2761711a18ceb14853bc97f7beb8bc35111d9a1af57585ea58636a44a193c"} Dec 03 17:38:38 crc kubenswrapper[4841]: I1203 17:38:38.473082 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:38:40 crc kubenswrapper[4841]: I1203 17:38:40.488329 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7f9b8" 
event={"ID":"a80810f2-62ac-4d7e-abe8-889db385f5eb","Type":"ContainerStarted","Data":"f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87"} Dec 03 17:38:41 crc kubenswrapper[4841]: I1203 17:38:41.499174 4841 generic.go:334] "Generic (PLEG): container finished" podID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerID="f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87" exitCode=0 Dec 03 17:38:41 crc kubenswrapper[4841]: I1203 17:38:41.499275 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7f9b8" event={"ID":"a80810f2-62ac-4d7e-abe8-889db385f5eb","Type":"ContainerDied","Data":"f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87"} Dec 03 17:38:42 crc kubenswrapper[4841]: I1203 17:38:42.512768 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7f9b8" event={"ID":"a80810f2-62ac-4d7e-abe8-889db385f5eb","Type":"ContainerStarted","Data":"50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e"} Dec 03 17:38:42 crc kubenswrapper[4841]: I1203 17:38:42.534381 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-7f9b8" podStartSLOduration=2.05812347 podStartE2EDuration="5.534356696s" podCreationTimestamp="2025-12-03 17:38:37 +0000 UTC" firstStartedPulling="2025-12-03 17:38:38.471573679 +0000 UTC m=+2312.859094446" lastFinishedPulling="2025-12-03 17:38:41.947806905 +0000 UTC m=+2316.335327672" observedRunningTime="2025-12-03 17:38:42.534031998 +0000 UTC m=+2316.921552765" watchObservedRunningTime="2025-12-03 17:38:42.534356696 +0000 UTC m=+2316.921877443" Dec 03 17:38:47 crc kubenswrapper[4841]: I1203 17:38:47.238632 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:38:47 crc kubenswrapper[4841]: E1203 17:38:47.239312 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:38:47 crc kubenswrapper[4841]: I1203 17:38:47.362138 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:47 crc kubenswrapper[4841]: I1203 17:38:47.362641 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:47 crc kubenswrapper[4841]: I1203 17:38:47.422318 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:47 crc kubenswrapper[4841]: I1203 17:38:47.630346 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:47 crc kubenswrapper[4841]: I1203 17:38:47.696354 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7f9b8"] Dec 03 17:38:49 crc kubenswrapper[4841]: I1203 17:38:49.583973 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-7f9b8" podUID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerName="registry-server" containerID="cri-o://50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e" gracePeriod=2 Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.109245 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.289305 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2smb\" (UniqueName: \"kubernetes.io/projected/a80810f2-62ac-4d7e-abe8-889db385f5eb-kube-api-access-t2smb\") pod \"a80810f2-62ac-4d7e-abe8-889db385f5eb\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.289648 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-utilities\") pod \"a80810f2-62ac-4d7e-abe8-889db385f5eb\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.289761 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-catalog-content\") pod \"a80810f2-62ac-4d7e-abe8-889db385f5eb\" (UID: \"a80810f2-62ac-4d7e-abe8-889db385f5eb\") " Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.291247 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-utilities" (OuterVolumeSpecName: "utilities") pod "a80810f2-62ac-4d7e-abe8-889db385f5eb" (UID: "a80810f2-62ac-4d7e-abe8-889db385f5eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.298637 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a80810f2-62ac-4d7e-abe8-889db385f5eb-kube-api-access-t2smb" (OuterVolumeSpecName: "kube-api-access-t2smb") pod "a80810f2-62ac-4d7e-abe8-889db385f5eb" (UID: "a80810f2-62ac-4d7e-abe8-889db385f5eb"). InnerVolumeSpecName "kube-api-access-t2smb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.331724 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a80810f2-62ac-4d7e-abe8-889db385f5eb" (UID: "a80810f2-62ac-4d7e-abe8-889db385f5eb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.393580 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.393634 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a80810f2-62ac-4d7e-abe8-889db385f5eb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.393708 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2smb\" (UniqueName: \"kubernetes.io/projected/a80810f2-62ac-4d7e-abe8-889db385f5eb-kube-api-access-t2smb\") on node \"crc\" DevicePath \"\"" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.601725 4841 generic.go:334] "Generic (PLEG): container finished" podID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerID="50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e" exitCode=0 Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.601803 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7f9b8" event={"ID":"a80810f2-62ac-4d7e-abe8-889db385f5eb","Type":"ContainerDied","Data":"50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e"} Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.601815 4841 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-7f9b8" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.601847 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-7f9b8" event={"ID":"a80810f2-62ac-4d7e-abe8-889db385f5eb","Type":"ContainerDied","Data":"53c2761711a18ceb14853bc97f7beb8bc35111d9a1af57585ea58636a44a193c"} Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.601876 4841 scope.go:117] "RemoveContainer" containerID="50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.626139 4841 scope.go:117] "RemoveContainer" containerID="f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.669455 4841 scope.go:117] "RemoveContainer" containerID="24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.683127 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-7f9b8"] Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.693525 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-7f9b8"] Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.719227 4841 scope.go:117] "RemoveContainer" containerID="50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e" Dec 03 17:38:50 crc kubenswrapper[4841]: E1203 17:38:50.719858 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e\": container with ID starting with 50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e not found: ID does not exist" containerID="50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.720011 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e"} err="failed to get container status \"50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e\": rpc error: code = NotFound desc = could not find container \"50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e\": container with ID starting with 50b5ac72e1946c29fe4b5d295f421c61abe2f90685fd11e14c5c15de619f829e not found: ID does not exist" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.720045 4841 scope.go:117] "RemoveContainer" containerID="f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87" Dec 03 17:38:50 crc kubenswrapper[4841]: E1203 17:38:50.720476 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87\": container with ID starting with f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87 not found: ID does not exist" containerID="f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.720522 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87"} err="failed to get container status \"f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87\": rpc error: code = NotFound desc = could not find container \"f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87\": container with ID starting with f018497bd46115c08a5204d0e975202f6f9c4ce00185c645a447ae5a20853d87 not found: ID does not exist" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.720550 4841 scope.go:117] "RemoveContainer" containerID="24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628" Dec 03 17:38:50 crc kubenswrapper[4841]: E1203 
17:38:50.720999 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628\": container with ID starting with 24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628 not found: ID does not exist" containerID="24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628" Dec 03 17:38:50 crc kubenswrapper[4841]: I1203 17:38:50.721086 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628"} err="failed to get container status \"24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628\": rpc error: code = NotFound desc = could not find container \"24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628\": container with ID starting with 24f91f4d51b38e5e6953f63d5b5120bafc5de924a4f0e460f794777c357b1628 not found: ID does not exist" Dec 03 17:38:52 crc kubenswrapper[4841]: I1203 17:38:52.260540 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a80810f2-62ac-4d7e-abe8-889db385f5eb" path="/var/lib/kubelet/pods/a80810f2-62ac-4d7e-abe8-889db385f5eb/volumes" Dec 03 17:39:02 crc kubenswrapper[4841]: I1203 17:39:02.240168 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:39:02 crc kubenswrapper[4841]: E1203 17:39:02.241336 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:39:15 crc kubenswrapper[4841]: I1203 17:39:15.239513 
4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:39:15 crc kubenswrapper[4841]: E1203 17:39:15.240943 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:39:28 crc kubenswrapper[4841]: I1203 17:39:28.239559 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:39:28 crc kubenswrapper[4841]: E1203 17:39:28.240573 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:39:42 crc kubenswrapper[4841]: I1203 17:39:42.241450 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:39:42 crc kubenswrapper[4841]: E1203 17:39:42.242805 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:39:53 crc kubenswrapper[4841]: I1203 
17:39:53.239124 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:39:53 crc kubenswrapper[4841]: E1203 17:39:53.240136 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:40:04 crc kubenswrapper[4841]: I1203 17:40:04.240603 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:40:04 crc kubenswrapper[4841]: E1203 17:40:04.242500 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:40:19 crc kubenswrapper[4841]: I1203 17:40:19.239870 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:40:19 crc kubenswrapper[4841]: E1203 17:40:19.242026 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:40:31 crc 
kubenswrapper[4841]: I1203 17:40:31.239510 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:40:31 crc kubenswrapper[4841]: E1203 17:40:31.240300 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:40:42 crc kubenswrapper[4841]: I1203 17:40:42.164604 4841 generic.go:334] "Generic (PLEG): container finished" podID="fbcfe5cb-55b6-4840-ad0c-a916165933d6" containerID="d532b686dd0b045c14dcd96e744618cdf74c2e51df45f850d936dc9752fa27fb" exitCode=0 Dec 03 17:40:42 crc kubenswrapper[4841]: I1203 17:40:42.164710 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" event={"ID":"fbcfe5cb-55b6-4840-ad0c-a916165933d6","Type":"ContainerDied","Data":"d532b686dd0b045c14dcd96e744618cdf74c2e51df45f850d936dc9752fa27fb"} Dec 03 17:40:42 crc kubenswrapper[4841]: I1203 17:40:42.239540 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:40:42 crc kubenswrapper[4841]: E1203 17:40:42.239877 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.598692 4841 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.797229 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-ssh-key\") pod \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.797442 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-inventory\") pod \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.797502 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-secret-0\") pod \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.797823 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c55zh\" (UniqueName: \"kubernetes.io/projected/fbcfe5cb-55b6-4840-ad0c-a916165933d6-kube-api-access-c55zh\") pod \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.798007 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-combined-ca-bundle\") pod \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\" (UID: \"fbcfe5cb-55b6-4840-ad0c-a916165933d6\") " Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.804952 4841 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbcfe5cb-55b6-4840-ad0c-a916165933d6-kube-api-access-c55zh" (OuterVolumeSpecName: "kube-api-access-c55zh") pod "fbcfe5cb-55b6-4840-ad0c-a916165933d6" (UID: "fbcfe5cb-55b6-4840-ad0c-a916165933d6"). InnerVolumeSpecName "kube-api-access-c55zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.805457 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fbcfe5cb-55b6-4840-ad0c-a916165933d6" (UID: "fbcfe5cb-55b6-4840-ad0c-a916165933d6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.830650 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-inventory" (OuterVolumeSpecName: "inventory") pod "fbcfe5cb-55b6-4840-ad0c-a916165933d6" (UID: "fbcfe5cb-55b6-4840-ad0c-a916165933d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.849268 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "fbcfe5cb-55b6-4840-ad0c-a916165933d6" (UID: "fbcfe5cb-55b6-4840-ad0c-a916165933d6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.858090 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "fbcfe5cb-55b6-4840-ad0c-a916165933d6" (UID: "fbcfe5cb-55b6-4840-ad0c-a916165933d6"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.901236 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c55zh\" (UniqueName: \"kubernetes.io/projected/fbcfe5cb-55b6-4840-ad0c-a916165933d6-kube-api-access-c55zh\") on node \"crc\" DevicePath \"\"" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.901285 4841 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.901297 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.901308 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:40:43 crc kubenswrapper[4841]: I1203 17:40:43.901320 4841 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/fbcfe5cb-55b6-4840-ad0c-a916165933d6-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.211181 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" event={"ID":"fbcfe5cb-55b6-4840-ad0c-a916165933d6","Type":"ContainerDied","Data":"f51d35701746bccda649c5bddbf57214b20389ceba8bd031030f685331f3f345"} Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.211262 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.211272 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f51d35701746bccda649c5bddbf57214b20389ceba8bd031030f685331f3f345" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.296546 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk"] Dec 03 17:40:44 crc kubenswrapper[4841]: E1203 17:40:44.297245 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerName="registry-server" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.297333 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerName="registry-server" Dec 03 17:40:44 crc kubenswrapper[4841]: E1203 17:40:44.297420 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fbcfe5cb-55b6-4840-ad0c-a916165933d6" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.297489 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbcfe5cb-55b6-4840-ad0c-a916165933d6" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 17:40:44 crc kubenswrapper[4841]: E1203 17:40:44.297604 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerName="extract-utilities" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.297691 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerName="extract-utilities" Dec 03 17:40:44 crc kubenswrapper[4841]: E1203 17:40:44.297771 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerName="extract-content" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.297843 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerName="extract-content" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.298153 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbcfe5cb-55b6-4840-ad0c-a916165933d6" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.298242 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a80810f2-62ac-4d7e-abe8-889db385f5eb" containerName="registry-server" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.299168 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.302302 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.302335 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.302311 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.302393 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.302394 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.302982 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.305010 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.306794 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.307306 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.307531 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb9c5\" (UniqueName: \"kubernetes.io/projected/986f7983-1ff5-4510-a8e9-0e45c0fddd19-kube-api-access-mb9c5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.307631 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.307747 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.307874 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: 
\"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.308156 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.308283 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.308412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.320665 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk"] Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.410871 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: 
\"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.411435 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb9c5\" (UniqueName: \"kubernetes.io/projected/986f7983-1ff5-4510-a8e9-0e45c0fddd19-kube-api-access-mb9c5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.411494 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.411530 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.411590 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.411659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.411682 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.411830 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.412785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.412787 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.414743 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.415197 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.415517 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.416013 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.416271 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.416410 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.416442 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.428877 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb9c5\" (UniqueName: \"kubernetes.io/projected/986f7983-1ff5-4510-a8e9-0e45c0fddd19-kube-api-access-mb9c5\") pod \"nova-edpm-deployment-openstack-edpm-ipam-7s8tk\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:44 crc kubenswrapper[4841]: I1203 17:40:44.629133 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:40:45 crc kubenswrapper[4841]: W1203 17:40:45.164873 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod986f7983_1ff5_4510_a8e9_0e45c0fddd19.slice/crio-30a3975343a4a5adedb3f433e3511a21fab432ec884272921d5d053be49c119d WatchSource:0}: Error finding container 30a3975343a4a5adedb3f433e3511a21fab432ec884272921d5d053be49c119d: Status 404 returned error can't find the container with id 30a3975343a4a5adedb3f433e3511a21fab432ec884272921d5d053be49c119d Dec 03 17:40:45 crc kubenswrapper[4841]: I1203 17:40:45.168436 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk"] Dec 03 17:40:45 crc kubenswrapper[4841]: I1203 17:40:45.220206 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" event={"ID":"986f7983-1ff5-4510-a8e9-0e45c0fddd19","Type":"ContainerStarted","Data":"30a3975343a4a5adedb3f433e3511a21fab432ec884272921d5d053be49c119d"} Dec 03 17:40:46 crc kubenswrapper[4841]: I1203 17:40:46.267515 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" event={"ID":"986f7983-1ff5-4510-a8e9-0e45c0fddd19","Type":"ContainerStarted","Data":"df1deb6c3ab59753f26bae2129ec84737c9d24ab3d6b43bad252250009001686"} Dec 03 17:40:47 crc kubenswrapper[4841]: I1203 17:40:47.304278 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" podStartSLOduration=2.79198614 podStartE2EDuration="3.30425336s" podCreationTimestamp="2025-12-03 17:40:44 +0000 UTC" firstStartedPulling="2025-12-03 17:40:45.167093094 +0000 UTC m=+2439.554613821" lastFinishedPulling="2025-12-03 17:40:45.679360304 +0000 UTC m=+2440.066881041" observedRunningTime="2025-12-03 
17:40:47.297302419 +0000 UTC m=+2441.684823146" watchObservedRunningTime="2025-12-03 17:40:47.30425336 +0000 UTC m=+2441.691774097" Dec 03 17:40:54 crc kubenswrapper[4841]: I1203 17:40:54.238884 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:40:54 crc kubenswrapper[4841]: E1203 17:40:54.240007 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:41:07 crc kubenswrapper[4841]: I1203 17:41:07.239069 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:41:07 crc kubenswrapper[4841]: E1203 17:41:07.240142 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:41:19 crc kubenswrapper[4841]: I1203 17:41:19.239893 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:41:19 crc kubenswrapper[4841]: E1203 17:41:19.240990 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:41:34 crc kubenswrapper[4841]: I1203 17:41:34.239176 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:41:34 crc kubenswrapper[4841]: E1203 17:41:34.240008 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:41:49 crc kubenswrapper[4841]: I1203 17:41:49.239163 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:41:49 crc kubenswrapper[4841]: E1203 17:41:49.240101 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:42:03 crc kubenswrapper[4841]: I1203 17:42:03.239434 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:42:03 crc kubenswrapper[4841]: E1203 17:42:03.240135 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:42:17 crc kubenswrapper[4841]: I1203 17:42:17.239344 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:42:17 crc kubenswrapper[4841]: E1203 17:42:17.240227 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:42:29 crc kubenswrapper[4841]: I1203 17:42:29.239350 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:42:29 crc kubenswrapper[4841]: E1203 17:42:29.240210 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:42:40 crc kubenswrapper[4841]: I1203 17:42:40.239381 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:42:40 crc kubenswrapper[4841]: E1203 17:42:40.241234 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:42:52 crc kubenswrapper[4841]: I1203 17:42:52.239980 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:42:52 crc kubenswrapper[4841]: E1203 17:42:52.241048 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:43:03 crc kubenswrapper[4841]: I1203 17:43:03.238449 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:43:03 crc kubenswrapper[4841]: E1203 17:43:03.239397 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:43:17 crc kubenswrapper[4841]: I1203 17:43:17.238786 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:43:17 crc kubenswrapper[4841]: I1203 17:43:17.804043 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" 
event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"fad0ee6f64d75b8a1d0900eaf1d5ad36ded88ec55f6c520431d0f06e564f2175"} Dec 03 17:43:42 crc kubenswrapper[4841]: I1203 17:43:42.077420 4841 generic.go:334] "Generic (PLEG): container finished" podID="986f7983-1ff5-4510-a8e9-0e45c0fddd19" containerID="df1deb6c3ab59753f26bae2129ec84737c9d24ab3d6b43bad252250009001686" exitCode=0 Dec 03 17:43:42 crc kubenswrapper[4841]: I1203 17:43:42.077730 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" event={"ID":"986f7983-1ff5-4510-a8e9-0e45c0fddd19","Type":"ContainerDied","Data":"df1deb6c3ab59753f26bae2129ec84737c9d24ab3d6b43bad252250009001686"} Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.548463 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.736080 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-combined-ca-bundle\") pod \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.736432 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb9c5\" (UniqueName: \"kubernetes.io/projected/986f7983-1ff5-4510-a8e9-0e45c0fddd19-kube-api-access-mb9c5\") pod \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.736625 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-1\") pod 
\"986f7983-1ff5-4510-a8e9-0e45c0fddd19\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.736776 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-ssh-key\") pod \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.737048 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-extra-config-0\") pod \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.737286 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-0\") pod \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.737442 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-inventory\") pod \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.737567 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-1\") pod \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.737819 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-0\") pod \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\" (UID: \"986f7983-1ff5-4510-a8e9-0e45c0fddd19\") " Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.744713 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "986f7983-1ff5-4510-a8e9-0e45c0fddd19" (UID: "986f7983-1ff5-4510-a8e9-0e45c0fddd19"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.748009 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986f7983-1ff5-4510-a8e9-0e45c0fddd19-kube-api-access-mb9c5" (OuterVolumeSpecName: "kube-api-access-mb9c5") pod "986f7983-1ff5-4510-a8e9-0e45c0fddd19" (UID: "986f7983-1ff5-4510-a8e9-0e45c0fddd19"). InnerVolumeSpecName "kube-api-access-mb9c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.772775 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "986f7983-1ff5-4510-a8e9-0e45c0fddd19" (UID: "986f7983-1ff5-4510-a8e9-0e45c0fddd19"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.773367 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "986f7983-1ff5-4510-a8e9-0e45c0fddd19" (UID: "986f7983-1ff5-4510-a8e9-0e45c0fddd19"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.776702 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "986f7983-1ff5-4510-a8e9-0e45c0fddd19" (UID: "986f7983-1ff5-4510-a8e9-0e45c0fddd19"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.786717 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "986f7983-1ff5-4510-a8e9-0e45c0fddd19" (UID: "986f7983-1ff5-4510-a8e9-0e45c0fddd19"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.789643 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-inventory" (OuterVolumeSpecName: "inventory") pod "986f7983-1ff5-4510-a8e9-0e45c0fddd19" (UID: "986f7983-1ff5-4510-a8e9-0e45c0fddd19"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.795264 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "986f7983-1ff5-4510-a8e9-0e45c0fddd19" (UID: "986f7983-1ff5-4510-a8e9-0e45c0fddd19"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.800243 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "986f7983-1ff5-4510-a8e9-0e45c0fddd19" (UID: "986f7983-1ff5-4510-a8e9-0e45c0fddd19"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.840651 4841 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.840747 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.840764 4841 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.840781 4841 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: 
\"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.840794 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.840806 4841 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.840818 4841 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.840829 4841 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/986f7983-1ff5-4510-a8e9-0e45c0fddd19-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:43:43 crc kubenswrapper[4841]: I1203 17:43:43.840843 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb9c5\" (UniqueName: \"kubernetes.io/projected/986f7983-1ff5-4510-a8e9-0e45c0fddd19-kube-api-access-mb9c5\") on node \"crc\" DevicePath \"\"" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.099337 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" event={"ID":"986f7983-1ff5-4510-a8e9-0e45c0fddd19","Type":"ContainerDied","Data":"30a3975343a4a5adedb3f433e3511a21fab432ec884272921d5d053be49c119d"} Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.099367 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-7s8tk" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.099390 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30a3975343a4a5adedb3f433e3511a21fab432ec884272921d5d053be49c119d" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.229846 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8"] Dec 03 17:43:44 crc kubenswrapper[4841]: E1203 17:43:44.230735 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986f7983-1ff5-4510-a8e9-0e45c0fddd19" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.230767 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="986f7983-1ff5-4510-a8e9-0e45c0fddd19" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.231200 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="986f7983-1ff5-4510-a8e9-0e45c0fddd19" containerName="nova-edpm-deployment-openstack-edpm-ipam" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.232367 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.235448 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.235507 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.236402 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.241397 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-9r52r" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.241630 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.260129 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8"] Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.349888 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.349974 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: 
\"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.350027 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.350049 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.350085 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqg5v\" (UniqueName: \"kubernetes.io/projected/902cbae6-47ea-4334-8623-8148bb196870-kube-api-access-mqg5v\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.350105 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 
crc kubenswrapper[4841]: I1203 17:43:44.350155 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.452013 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.452069 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.452107 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.452134 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.452164 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqg5v\" (UniqueName: \"kubernetes.io/projected/902cbae6-47ea-4334-8623-8148bb196870-kube-api-access-mqg5v\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.452184 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.452225 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.457216 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: 
\"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.457333 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.457972 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.457998 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.459493 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.462694 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.471282 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqg5v\" (UniqueName: \"kubernetes.io/projected/902cbae6-47ea-4334-8623-8148bb196870-kube-api-access-mqg5v\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-82fk8\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.604068 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.974064 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8"] Dec 03 17:43:44 crc kubenswrapper[4841]: W1203 17:43:44.978280 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod902cbae6_47ea_4334_8623_8148bb196870.slice/crio-548bb2bc16b5d135ddff7d72efe1e6337dddd7be0d4e449d26c73bbaf90d9b5d WatchSource:0}: Error finding container 548bb2bc16b5d135ddff7d72efe1e6337dddd7be0d4e449d26c73bbaf90d9b5d: Status 404 returned error can't find the container with id 548bb2bc16b5d135ddff7d72efe1e6337dddd7be0d4e449d26c73bbaf90d9b5d Dec 03 17:43:44 crc kubenswrapper[4841]: I1203 17:43:44.980823 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:43:45 crc kubenswrapper[4841]: I1203 17:43:45.111320 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" event={"ID":"902cbae6-47ea-4334-8623-8148bb196870","Type":"ContainerStarted","Data":"548bb2bc16b5d135ddff7d72efe1e6337dddd7be0d4e449d26c73bbaf90d9b5d"} Dec 03 17:43:46 crc kubenswrapper[4841]: I1203 17:43:46.123780 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" event={"ID":"902cbae6-47ea-4334-8623-8148bb196870","Type":"ContainerStarted","Data":"29484c0cc05ff20227aca6eccaaac3ec672eb133de9c975c1522ff469b096a5c"} Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.062821 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" podStartSLOduration=26.57146974 podStartE2EDuration="27.062801544s" podCreationTimestamp="2025-12-03 17:43:44 +0000 UTC" firstStartedPulling="2025-12-03 17:43:44.98048399 +0000 UTC m=+2619.368004737" lastFinishedPulling="2025-12-03 17:43:45.471815814 +0000 UTC m=+2619.859336541" observedRunningTime="2025-12-03 17:43:46.145816932 +0000 UTC m=+2620.533337659" watchObservedRunningTime="2025-12-03 17:44:11.062801544 +0000 UTC m=+2645.450322281" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.066947 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zsmlf"] Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.071813 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.098805 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zsmlf"] Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.133558 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-catalog-content\") pod \"community-operators-zsmlf\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.133604 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq6qw\" (UniqueName: \"kubernetes.io/projected/2862c38a-c5de-4f21-bc10-5179dae95f23-kube-api-access-mq6qw\") pod \"community-operators-zsmlf\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.133621 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-utilities\") pod \"community-operators-zsmlf\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.236866 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-catalog-content\") pod \"community-operators-zsmlf\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.237443 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-mq6qw\" (UniqueName: \"kubernetes.io/projected/2862c38a-c5de-4f21-bc10-5179dae95f23-kube-api-access-mq6qw\") pod \"community-operators-zsmlf\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.237813 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-utilities\") pod \"community-operators-zsmlf\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.237384 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-catalog-content\") pod \"community-operators-zsmlf\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.238276 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-utilities\") pod \"community-operators-zsmlf\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.258264 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq6qw\" (UniqueName: \"kubernetes.io/projected/2862c38a-c5de-4f21-bc10-5179dae95f23-kube-api-access-mq6qw\") pod \"community-operators-zsmlf\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.411081 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:11 crc kubenswrapper[4841]: I1203 17:44:11.943916 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zsmlf"] Dec 03 17:44:12 crc kubenswrapper[4841]: I1203 17:44:12.388666 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsmlf" event={"ID":"2862c38a-c5de-4f21-bc10-5179dae95f23","Type":"ContainerStarted","Data":"6b80c0d9b282b259fa9a3b0d9a7e020147dd86b21efa15112dba78b927cffd04"} Dec 03 17:44:13 crc kubenswrapper[4841]: I1203 17:44:13.410994 4841 generic.go:334] "Generic (PLEG): container finished" podID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerID="430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a" exitCode=0 Dec 03 17:44:13 crc kubenswrapper[4841]: I1203 17:44:13.411072 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsmlf" event={"ID":"2862c38a-c5de-4f21-bc10-5179dae95f23","Type":"ContainerDied","Data":"430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a"} Dec 03 17:44:14 crc kubenswrapper[4841]: I1203 17:44:14.423627 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsmlf" event={"ID":"2862c38a-c5de-4f21-bc10-5179dae95f23","Type":"ContainerStarted","Data":"078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2"} Dec 03 17:44:15 crc kubenswrapper[4841]: I1203 17:44:15.436389 4841 generic.go:334] "Generic (PLEG): container finished" podID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerID="078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2" exitCode=0 Dec 03 17:44:15 crc kubenswrapper[4841]: I1203 17:44:15.436474 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsmlf" 
event={"ID":"2862c38a-c5de-4f21-bc10-5179dae95f23","Type":"ContainerDied","Data":"078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2"} Dec 03 17:44:16 crc kubenswrapper[4841]: I1203 17:44:16.451753 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsmlf" event={"ID":"2862c38a-c5de-4f21-bc10-5179dae95f23","Type":"ContainerStarted","Data":"2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695"} Dec 03 17:44:16 crc kubenswrapper[4841]: I1203 17:44:16.486250 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zsmlf" podStartSLOduration=3.014458162 podStartE2EDuration="5.486227666s" podCreationTimestamp="2025-12-03 17:44:11 +0000 UTC" firstStartedPulling="2025-12-03 17:44:13.412834494 +0000 UTC m=+2647.800355241" lastFinishedPulling="2025-12-03 17:44:15.884603978 +0000 UTC m=+2650.272124745" observedRunningTime="2025-12-03 17:44:16.476454535 +0000 UTC m=+2650.863975302" watchObservedRunningTime="2025-12-03 17:44:16.486227666 +0000 UTC m=+2650.873748403" Dec 03 17:44:21 crc kubenswrapper[4841]: I1203 17:44:21.412336 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:21 crc kubenswrapper[4841]: I1203 17:44:21.413023 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:21 crc kubenswrapper[4841]: I1203 17:44:21.487267 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:21 crc kubenswrapper[4841]: I1203 17:44:21.572186 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:21 crc kubenswrapper[4841]: I1203 17:44:21.739071 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-zsmlf"] Dec 03 17:44:23 crc kubenswrapper[4841]: I1203 17:44:23.513867 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zsmlf" podUID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerName="registry-server" containerID="cri-o://2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695" gracePeriod=2 Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.030861 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.103855 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-catalog-content\") pod \"2862c38a-c5de-4f21-bc10-5179dae95f23\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.103926 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-utilities\") pod \"2862c38a-c5de-4f21-bc10-5179dae95f23\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.103999 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq6qw\" (UniqueName: \"kubernetes.io/projected/2862c38a-c5de-4f21-bc10-5179dae95f23-kube-api-access-mq6qw\") pod \"2862c38a-c5de-4f21-bc10-5179dae95f23\" (UID: \"2862c38a-c5de-4f21-bc10-5179dae95f23\") " Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.104696 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-utilities" (OuterVolumeSpecName: "utilities") pod "2862c38a-c5de-4f21-bc10-5179dae95f23" (UID: 
"2862c38a-c5de-4f21-bc10-5179dae95f23"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.110070 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2862c38a-c5de-4f21-bc10-5179dae95f23-kube-api-access-mq6qw" (OuterVolumeSpecName: "kube-api-access-mq6qw") pod "2862c38a-c5de-4f21-bc10-5179dae95f23" (UID: "2862c38a-c5de-4f21-bc10-5179dae95f23"). InnerVolumeSpecName "kube-api-access-mq6qw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.163944 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2862c38a-c5de-4f21-bc10-5179dae95f23" (UID: "2862c38a-c5de-4f21-bc10-5179dae95f23"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.206271 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.206468 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2862c38a-c5de-4f21-bc10-5179dae95f23-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.206528 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq6qw\" (UniqueName: \"kubernetes.io/projected/2862c38a-c5de-4f21-bc10-5179dae95f23-kube-api-access-mq6qw\") on node \"crc\" DevicePath \"\"" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.527276 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-zsmlf" event={"ID":"2862c38a-c5de-4f21-bc10-5179dae95f23","Type":"ContainerDied","Data":"2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695"} Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.527730 4841 scope.go:117] "RemoveContainer" containerID="2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.527317 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zsmlf" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.527240 4841 generic.go:334] "Generic (PLEG): container finished" podID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerID="2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695" exitCode=0 Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.528217 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zsmlf" event={"ID":"2862c38a-c5de-4f21-bc10-5179dae95f23","Type":"ContainerDied","Data":"6b80c0d9b282b259fa9a3b0d9a7e020147dd86b21efa15112dba78b927cffd04"} Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.554773 4841 scope.go:117] "RemoveContainer" containerID="078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.572216 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zsmlf"] Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.583358 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zsmlf"] Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.587989 4841 scope.go:117] "RemoveContainer" containerID="430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.624211 4841 scope.go:117] "RemoveContainer" 
containerID="2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695" Dec 03 17:44:24 crc kubenswrapper[4841]: E1203 17:44:24.624700 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695\": container with ID starting with 2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695 not found: ID does not exist" containerID="2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.624734 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695"} err="failed to get container status \"2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695\": rpc error: code = NotFound desc = could not find container \"2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695\": container with ID starting with 2ae11d75f605dd017b46e080113b952531c045ab6341941c0daed37592ce2695 not found: ID does not exist" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.624756 4841 scope.go:117] "RemoveContainer" containerID="078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2" Dec 03 17:44:24 crc kubenswrapper[4841]: E1203 17:44:24.626899 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2\": container with ID starting with 078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2 not found: ID does not exist" containerID="078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.626962 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2"} err="failed to get container status \"078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2\": rpc error: code = NotFound desc = could not find container \"078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2\": container with ID starting with 078bf4d05beb404c673a67abce49dda214bff4fceaca3138e6d06ac6221f12f2 not found: ID does not exist" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.626998 4841 scope.go:117] "RemoveContainer" containerID="430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a" Dec 03 17:44:24 crc kubenswrapper[4841]: E1203 17:44:24.627449 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a\": container with ID starting with 430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a not found: ID does not exist" containerID="430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a" Dec 03 17:44:24 crc kubenswrapper[4841]: I1203 17:44:24.627493 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a"} err="failed to get container status \"430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a\": rpc error: code = NotFound desc = could not find container \"430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a\": container with ID starting with 430f9a6b9a99d3ca4cfdd627fc53411308df1436c4cb4c086e52dc23177a0b9a not found: ID does not exist" Dec 03 17:44:26 crc kubenswrapper[4841]: I1203 17:44:26.258274 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2862c38a-c5de-4f21-bc10-5179dae95f23" path="/var/lib/kubelet/pods/2862c38a-c5de-4f21-bc10-5179dae95f23/volumes" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 
17:45:00.147813 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk"] Dec 03 17:45:00 crc kubenswrapper[4841]: E1203 17:45:00.148662 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerName="extract-content" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.148674 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerName="extract-content" Dec 03 17:45:00 crc kubenswrapper[4841]: E1203 17:45:00.148708 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerName="registry-server" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.148715 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerName="registry-server" Dec 03 17:45:00 crc kubenswrapper[4841]: E1203 17:45:00.148730 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerName="extract-utilities" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.148737 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerName="extract-utilities" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.148974 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2862c38a-c5de-4f21-bc10-5179dae95f23" containerName="registry-server" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.149609 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.154603 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.154835 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.158378 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk"] Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.261048 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx8pm\" (UniqueName: \"kubernetes.io/projected/5471c84b-c948-45a7-9efc-1c67410be07a-kube-api-access-cx8pm\") pod \"collect-profiles-29413065-xbldk\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.261091 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5471c84b-c948-45a7-9efc-1c67410be07a-config-volume\") pod \"collect-profiles-29413065-xbldk\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.261395 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5471c84b-c948-45a7-9efc-1c67410be07a-secret-volume\") pod \"collect-profiles-29413065-xbldk\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.363168 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8pm\" (UniqueName: \"kubernetes.io/projected/5471c84b-c948-45a7-9efc-1c67410be07a-kube-api-access-cx8pm\") pod \"collect-profiles-29413065-xbldk\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.363246 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5471c84b-c948-45a7-9efc-1c67410be07a-config-volume\") pod \"collect-profiles-29413065-xbldk\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.363528 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5471c84b-c948-45a7-9efc-1c67410be07a-secret-volume\") pod \"collect-profiles-29413065-xbldk\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.367482 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5471c84b-c948-45a7-9efc-1c67410be07a-config-volume\") pod \"collect-profiles-29413065-xbldk\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.375838 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5471c84b-c948-45a7-9efc-1c67410be07a-secret-volume\") pod \"collect-profiles-29413065-xbldk\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.392891 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx8pm\" (UniqueName: \"kubernetes.io/projected/5471c84b-c948-45a7-9efc-1c67410be07a-kube-api-access-cx8pm\") pod \"collect-profiles-29413065-xbldk\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.477423 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:00 crc kubenswrapper[4841]: I1203 17:45:00.926326 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk"] Dec 03 17:45:01 crc kubenswrapper[4841]: I1203 17:45:01.924418 4841 generic.go:334] "Generic (PLEG): container finished" podID="5471c84b-c948-45a7-9efc-1c67410be07a" containerID="0e7b757fccccefda5507a023b5c67c336a802ecdac7d8da8e2abe31d4e46ddef" exitCode=0 Dec 03 17:45:01 crc kubenswrapper[4841]: I1203 17:45:01.924494 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" event={"ID":"5471c84b-c948-45a7-9efc-1c67410be07a","Type":"ContainerDied","Data":"0e7b757fccccefda5507a023b5c67c336a802ecdac7d8da8e2abe31d4e46ddef"} Dec 03 17:45:01 crc kubenswrapper[4841]: I1203 17:45:01.924777 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" 
event={"ID":"5471c84b-c948-45a7-9efc-1c67410be07a","Type":"ContainerStarted","Data":"348ac6ad0ac24f80fc734c4a1274a9742e8a3fd9e5f2c3f5855c756fd6b2e2c6"} Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.265880 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.323334 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5471c84b-c948-45a7-9efc-1c67410be07a-secret-volume\") pod \"5471c84b-c948-45a7-9efc-1c67410be07a\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.323522 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx8pm\" (UniqueName: \"kubernetes.io/projected/5471c84b-c948-45a7-9efc-1c67410be07a-kube-api-access-cx8pm\") pod \"5471c84b-c948-45a7-9efc-1c67410be07a\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.323607 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5471c84b-c948-45a7-9efc-1c67410be07a-config-volume\") pod \"5471c84b-c948-45a7-9efc-1c67410be07a\" (UID: \"5471c84b-c948-45a7-9efc-1c67410be07a\") " Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.325748 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5471c84b-c948-45a7-9efc-1c67410be07a-config-volume" (OuterVolumeSpecName: "config-volume") pod "5471c84b-c948-45a7-9efc-1c67410be07a" (UID: "5471c84b-c948-45a7-9efc-1c67410be07a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.330661 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5471c84b-c948-45a7-9efc-1c67410be07a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5471c84b-c948-45a7-9efc-1c67410be07a" (UID: "5471c84b-c948-45a7-9efc-1c67410be07a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.330941 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5471c84b-c948-45a7-9efc-1c67410be07a-kube-api-access-cx8pm" (OuterVolumeSpecName: "kube-api-access-cx8pm") pod "5471c84b-c948-45a7-9efc-1c67410be07a" (UID: "5471c84b-c948-45a7-9efc-1c67410be07a"). InnerVolumeSpecName "kube-api-access-cx8pm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.425843 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx8pm\" (UniqueName: \"kubernetes.io/projected/5471c84b-c948-45a7-9efc-1c67410be07a-kube-api-access-cx8pm\") on node \"crc\" DevicePath \"\"" Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.425877 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5471c84b-c948-45a7-9efc-1c67410be07a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.425887 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5471c84b-c948-45a7-9efc-1c67410be07a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.947787 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" 
event={"ID":"5471c84b-c948-45a7-9efc-1c67410be07a","Type":"ContainerDied","Data":"348ac6ad0ac24f80fc734c4a1274a9742e8a3fd9e5f2c3f5855c756fd6b2e2c6"} Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.947847 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="348ac6ad0ac24f80fc734c4a1274a9742e8a3fd9e5f2c3f5855c756fd6b2e2c6" Dec 03 17:45:03 crc kubenswrapper[4841]: I1203 17:45:03.947858 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413065-xbldk" Dec 03 17:45:04 crc kubenswrapper[4841]: I1203 17:45:04.346094 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l"] Dec 03 17:45:04 crc kubenswrapper[4841]: I1203 17:45:04.355247 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413020-pdv9l"] Dec 03 17:45:06 crc kubenswrapper[4841]: I1203 17:45:06.378417 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad231084-6053-40b1-892c-284992b5df93" path="/var/lib/kubelet/pods/ad231084-6053-40b1-892c-284992b5df93/volumes" Dec 03 17:45:23 crc kubenswrapper[4841]: I1203 17:45:23.189990 4841 scope.go:117] "RemoveContainer" containerID="61f081450db7f42930a6fbe061d97e9fc539af5dbd2e07cfd7b35b62fb9c257d" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.811647 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5ph88"] Dec 03 17:45:27 crc kubenswrapper[4841]: E1203 17:45:27.813065 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5471c84b-c948-45a7-9efc-1c67410be07a" containerName="collect-profiles" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.813086 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="5471c84b-c948-45a7-9efc-1c67410be07a" containerName="collect-profiles" Dec 03 17:45:27 crc 
kubenswrapper[4841]: I1203 17:45:27.813463 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="5471c84b-c948-45a7-9efc-1c67410be07a" containerName="collect-profiles" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.815606 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.846846 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ph88"] Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.861000 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-utilities\") pod \"redhat-operators-5ph88\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.861206 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsqf8\" (UniqueName: \"kubernetes.io/projected/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-kube-api-access-tsqf8\") pod \"redhat-operators-5ph88\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.861269 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-catalog-content\") pod \"redhat-operators-5ph88\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.962967 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-utilities\") pod \"redhat-operators-5ph88\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.963145 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsqf8\" (UniqueName: \"kubernetes.io/projected/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-kube-api-access-tsqf8\") pod \"redhat-operators-5ph88\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.963180 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-catalog-content\") pod \"redhat-operators-5ph88\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.963542 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-utilities\") pod \"redhat-operators-5ph88\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.963739 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-catalog-content\") pod \"redhat-operators-5ph88\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:27 crc kubenswrapper[4841]: I1203 17:45:27.995218 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsqf8\" (UniqueName: 
\"kubernetes.io/projected/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-kube-api-access-tsqf8\") pod \"redhat-operators-5ph88\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:28 crc kubenswrapper[4841]: I1203 17:45:28.152347 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:28 crc kubenswrapper[4841]: I1203 17:45:28.627261 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5ph88"] Dec 03 17:45:29 crc kubenswrapper[4841]: I1203 17:45:29.229712 4841 generic.go:334] "Generic (PLEG): container finished" podID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerID="33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c" exitCode=0 Dec 03 17:45:29 crc kubenswrapper[4841]: I1203 17:45:29.229807 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ph88" event={"ID":"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8","Type":"ContainerDied","Data":"33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c"} Dec 03 17:45:29 crc kubenswrapper[4841]: I1203 17:45:29.230020 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ph88" event={"ID":"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8","Type":"ContainerStarted","Data":"4635eb99ee69b669dbd1fdefc0eb4d7c05658998e08ba579d7cb99094739e5da"} Dec 03 17:45:30 crc kubenswrapper[4841]: I1203 17:45:30.250446 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ph88" event={"ID":"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8","Type":"ContainerStarted","Data":"a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d"} Dec 03 17:45:33 crc kubenswrapper[4841]: I1203 17:45:33.276937 4841 generic.go:334] "Generic (PLEG): container finished" podID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" 
containerID="a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d" exitCode=0 Dec 03 17:45:33 crc kubenswrapper[4841]: I1203 17:45:33.277029 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ph88" event={"ID":"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8","Type":"ContainerDied","Data":"a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d"} Dec 03 17:45:35 crc kubenswrapper[4841]: I1203 17:45:35.309606 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ph88" event={"ID":"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8","Type":"ContainerStarted","Data":"d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7"} Dec 03 17:45:35 crc kubenswrapper[4841]: I1203 17:45:35.337025 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5ph88" podStartSLOduration=3.155481138 podStartE2EDuration="8.3370009s" podCreationTimestamp="2025-12-03 17:45:27 +0000 UTC" firstStartedPulling="2025-12-03 17:45:29.231472541 +0000 UTC m=+2723.618993268" lastFinishedPulling="2025-12-03 17:45:34.412992303 +0000 UTC m=+2728.800513030" observedRunningTime="2025-12-03 17:45:35.32279054 +0000 UTC m=+2729.710311277" watchObservedRunningTime="2025-12-03 17:45:35.3370009 +0000 UTC m=+2729.724521637" Dec 03 17:45:38 crc kubenswrapper[4841]: I1203 17:45:38.153414 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:38 crc kubenswrapper[4841]: I1203 17:45:38.154007 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:39 crc kubenswrapper[4841]: I1203 17:45:39.204405 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5ph88" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerName="registry-server" probeResult="failure" 
output=< Dec 03 17:45:39 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Dec 03 17:45:39 crc kubenswrapper[4841]: > Dec 03 17:45:39 crc kubenswrapper[4841]: I1203 17:45:39.316505 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:45:39 crc kubenswrapper[4841]: I1203 17:45:39.316571 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.842660 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dxczx"] Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.845611 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.860318 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxczx"] Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.868619 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-utilities\") pod \"certified-operators-dxczx\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.868746 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vc9dc\" (UniqueName: \"kubernetes.io/projected/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-kube-api-access-vc9dc\") pod \"certified-operators-dxczx\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.868789 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-catalog-content\") pod \"certified-operators-dxczx\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.970769 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vc9dc\" (UniqueName: \"kubernetes.io/projected/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-kube-api-access-vc9dc\") pod \"certified-operators-dxczx\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.970827 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-catalog-content\") pod \"certified-operators-dxczx\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.970943 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-utilities\") pod \"certified-operators-dxczx\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.971375 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-catalog-content\") pod \"certified-operators-dxczx\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.971405 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-utilities\") pod \"certified-operators-dxczx\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:43 crc kubenswrapper[4841]: I1203 17:45:43.989552 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vc9dc\" (UniqueName: \"kubernetes.io/projected/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-kube-api-access-vc9dc\") pod \"certified-operators-dxczx\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:44 crc kubenswrapper[4841]: I1203 17:45:44.174364 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:44 crc kubenswrapper[4841]: I1203 17:45:44.509765 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxczx"] Dec 03 17:45:45 crc kubenswrapper[4841]: I1203 17:45:45.431521 4841 generic.go:334] "Generic (PLEG): container finished" podID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerID="56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7" exitCode=0 Dec 03 17:45:45 crc kubenswrapper[4841]: I1203 17:45:45.431872 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxczx" event={"ID":"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078","Type":"ContainerDied","Data":"56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7"} Dec 03 17:45:45 crc kubenswrapper[4841]: I1203 17:45:45.432119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxczx" event={"ID":"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078","Type":"ContainerStarted","Data":"dbd10742d56551bc5bb571d5f2dc7f4aa02d002da5f28495e859f7b84b9cd900"} Dec 03 17:45:46 crc kubenswrapper[4841]: I1203 17:45:46.445857 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxczx" event={"ID":"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078","Type":"ContainerStarted","Data":"0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c"} Dec 03 17:45:47 crc kubenswrapper[4841]: I1203 17:45:47.463307 4841 generic.go:334] "Generic (PLEG): container finished" podID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerID="0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c" exitCode=0 Dec 03 17:45:47 crc kubenswrapper[4841]: I1203 17:45:47.463395 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxczx" 
event={"ID":"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078","Type":"ContainerDied","Data":"0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c"} Dec 03 17:45:48 crc kubenswrapper[4841]: I1203 17:45:48.204134 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:48 crc kubenswrapper[4841]: I1203 17:45:48.252719 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:48 crc kubenswrapper[4841]: I1203 17:45:48.477997 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxczx" event={"ID":"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078","Type":"ContainerStarted","Data":"b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6"} Dec 03 17:45:50 crc kubenswrapper[4841]: I1203 17:45:50.614540 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dxczx" podStartSLOduration=5.194711695 podStartE2EDuration="7.614520064s" podCreationTimestamp="2025-12-03 17:45:43 +0000 UTC" firstStartedPulling="2025-12-03 17:45:45.433426549 +0000 UTC m=+2739.820947306" lastFinishedPulling="2025-12-03 17:45:47.853234948 +0000 UTC m=+2742.240755675" observedRunningTime="2025-12-03 17:45:48.52175453 +0000 UTC m=+2742.909275257" watchObservedRunningTime="2025-12-03 17:45:50.614520064 +0000 UTC m=+2745.002040791" Dec 03 17:45:50 crc kubenswrapper[4841]: I1203 17:45:50.615267 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ph88"] Dec 03 17:45:50 crc kubenswrapper[4841]: I1203 17:45:50.615466 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5ph88" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerName="registry-server" 
containerID="cri-o://d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7" gracePeriod=2 Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.357697 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.511446 4841 generic.go:334] "Generic (PLEG): container finished" podID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerID="d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7" exitCode=0 Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.511502 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ph88" event={"ID":"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8","Type":"ContainerDied","Data":"d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7"} Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.511534 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5ph88" event={"ID":"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8","Type":"ContainerDied","Data":"4635eb99ee69b669dbd1fdefc0eb4d7c05658998e08ba579d7cb99094739e5da"} Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.511567 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5ph88" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.511576 4841 scope.go:117] "RemoveContainer" containerID="d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.537946 4841 scope.go:117] "RemoveContainer" containerID="a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.542584 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-utilities\") pod \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.542659 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-catalog-content\") pod \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.542902 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsqf8\" (UniqueName: \"kubernetes.io/projected/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-kube-api-access-tsqf8\") pod \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\" (UID: \"96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8\") " Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.543266 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-utilities" (OuterVolumeSpecName: "utilities") pod "96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" (UID: "96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.543865 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.550373 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-kube-api-access-tsqf8" (OuterVolumeSpecName: "kube-api-access-tsqf8") pod "96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" (UID: "96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8"). InnerVolumeSpecName "kube-api-access-tsqf8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.558987 4841 scope.go:117] "RemoveContainer" containerID="33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.639075 4841 scope.go:117] "RemoveContainer" containerID="d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7" Dec 03 17:45:51 crc kubenswrapper[4841]: E1203 17:45:51.639626 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7\": container with ID starting with d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7 not found: ID does not exist" containerID="d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.639702 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7"} err="failed to get container status \"d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7\": rpc error: code = NotFound desc = could not find container 
\"d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7\": container with ID starting with d7355cdcff34b5e86712252cbdc241d1289b6040841683f8ac25983e41ffa2f7 not found: ID does not exist" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.639740 4841 scope.go:117] "RemoveContainer" containerID="a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d" Dec 03 17:45:51 crc kubenswrapper[4841]: E1203 17:45:51.641695 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d\": container with ID starting with a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d not found: ID does not exist" containerID="a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.641733 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d"} err="failed to get container status \"a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d\": rpc error: code = NotFound desc = could not find container \"a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d\": container with ID starting with a5de30571bbf28417cf6583b15fd0ef58dd9e10788d484576c93ff87bf64298d not found: ID does not exist" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.641765 4841 scope.go:117] "RemoveContainer" containerID="33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c" Dec 03 17:45:51 crc kubenswrapper[4841]: E1203 17:45:51.642371 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c\": container with ID starting with 33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c not found: ID does not exist" 
containerID="33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.642427 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c"} err="failed to get container status \"33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c\": rpc error: code = NotFound desc = could not find container \"33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c\": container with ID starting with 33b9917fb15d03392aa5a575697e166fa0036ceaeda024a157f304a5a211338c not found: ID does not exist" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.645720 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsqf8\" (UniqueName: \"kubernetes.io/projected/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-kube-api-access-tsqf8\") on node \"crc\" DevicePath \"\"" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.668121 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" (UID: "96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.747976 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.852078 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5ph88"] Dec 03 17:45:51 crc kubenswrapper[4841]: I1203 17:45:51.860620 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5ph88"] Dec 03 17:45:52 crc kubenswrapper[4841]: I1203 17:45:52.249281 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" path="/var/lib/kubelet/pods/96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8/volumes" Dec 03 17:45:54 crc kubenswrapper[4841]: I1203 17:45:54.175277 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:54 crc kubenswrapper[4841]: I1203 17:45:54.175691 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:54 crc kubenswrapper[4841]: I1203 17:45:54.258484 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:54 crc kubenswrapper[4841]: I1203 17:45:54.611397 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:55 crc kubenswrapper[4841]: I1203 17:45:55.417684 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dxczx"] Dec 03 17:45:56 crc kubenswrapper[4841]: I1203 17:45:56.565124 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/certified-operators-dxczx" podUID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerName="registry-server" containerID="cri-o://b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6" gracePeriod=2 Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.185507 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.360142 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-utilities\") pod \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.360219 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vc9dc\" (UniqueName: \"kubernetes.io/projected/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-kube-api-access-vc9dc\") pod \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.360343 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-catalog-content\") pod \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\" (UID: \"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078\") " Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.360887 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-utilities" (OuterVolumeSpecName: "utilities") pod "34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" (UID: "34d4d5ac-e5a6-4347-8e36-5a5d3d70f078"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.362114 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.367741 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-kube-api-access-vc9dc" (OuterVolumeSpecName: "kube-api-access-vc9dc") pod "34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" (UID: "34d4d5ac-e5a6-4347-8e36-5a5d3d70f078"). InnerVolumeSpecName "kube-api-access-vc9dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.450408 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" (UID: "34d4d5ac-e5a6-4347-8e36-5a5d3d70f078"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.464278 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vc9dc\" (UniqueName: \"kubernetes.io/projected/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-kube-api-access-vc9dc\") on node \"crc\" DevicePath \"\"" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.464392 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.577495 4841 generic.go:334] "Generic (PLEG): container finished" podID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerID="b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6" exitCode=0 Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.577543 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxczx" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.577570 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxczx" event={"ID":"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078","Type":"ContainerDied","Data":"b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6"} Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.578024 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxczx" event={"ID":"34d4d5ac-e5a6-4347-8e36-5a5d3d70f078","Type":"ContainerDied","Data":"dbd10742d56551bc5bb571d5f2dc7f4aa02d002da5f28495e859f7b84b9cd900"} Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.578075 4841 scope.go:117] "RemoveContainer" containerID="b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.602944 4841 scope.go:117] "RemoveContainer" 
containerID="0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.629327 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dxczx"] Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.642166 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dxczx"] Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.644030 4841 scope.go:117] "RemoveContainer" containerID="56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.716284 4841 scope.go:117] "RemoveContainer" containerID="b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6" Dec 03 17:45:57 crc kubenswrapper[4841]: E1203 17:45:57.716693 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6\": container with ID starting with b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6 not found: ID does not exist" containerID="b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.716823 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6"} err="failed to get container status \"b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6\": rpc error: code = NotFound desc = could not find container \"b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6\": container with ID starting with b74edb42a747fb6ba3b8b7edf463a4611bcf3c31a3f67bd37f1c4d68fde52ea6 not found: ID does not exist" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.716933 4841 scope.go:117] "RemoveContainer" 
containerID="0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c" Dec 03 17:45:57 crc kubenswrapper[4841]: E1203 17:45:57.717300 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c\": container with ID starting with 0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c not found: ID does not exist" containerID="0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.717390 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c"} err="failed to get container status \"0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c\": rpc error: code = NotFound desc = could not find container \"0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c\": container with ID starting with 0e6632f2ae5988b39064161a6001abb6551aeac59d12669ca7bd09bb96b7c44c not found: ID does not exist" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.717472 4841 scope.go:117] "RemoveContainer" containerID="56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7" Dec 03 17:45:57 crc kubenswrapper[4841]: E1203 17:45:57.717820 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7\": container with ID starting with 56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7 not found: ID does not exist" containerID="56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7" Dec 03 17:45:57 crc kubenswrapper[4841]: I1203 17:45:57.717866 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7"} err="failed to get container status \"56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7\": rpc error: code = NotFound desc = could not find container \"56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7\": container with ID starting with 56074875243e86f676afdde8c5c05bdddee2bb8ae8abd3cde887cad5ddfe7fb7 not found: ID does not exist" Dec 03 17:45:57 crc kubenswrapper[4841]: E1203 17:45:57.892130 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34d4d5ac_e5a6_4347_8e36_5a5d3d70f078.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34d4d5ac_e5a6_4347_8e36_5a5d3d70f078.slice/crio-dbd10742d56551bc5bb571d5f2dc7f4aa02d002da5f28495e859f7b84b9cd900\": RecentStats: unable to find data in memory cache]" Dec 03 17:45:58 crc kubenswrapper[4841]: I1203 17:45:58.249491 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" path="/var/lib/kubelet/pods/34d4d5ac-e5a6-4347-8e36-5a5d3d70f078/volumes" Dec 03 17:46:09 crc kubenswrapper[4841]: I1203 17:46:09.317084 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:46:09 crc kubenswrapper[4841]: I1203 17:46:09.317670 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Dec 03 17:46:15 crc kubenswrapper[4841]: I1203 17:46:15.797540 4841 generic.go:334] "Generic (PLEG): container finished" podID="902cbae6-47ea-4334-8623-8148bb196870" containerID="29484c0cc05ff20227aca6eccaaac3ec672eb133de9c975c1522ff469b096a5c" exitCode=0 Dec 03 17:46:15 crc kubenswrapper[4841]: I1203 17:46:15.797623 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" event={"ID":"902cbae6-47ea-4334-8623-8148bb196870","Type":"ContainerDied","Data":"29484c0cc05ff20227aca6eccaaac3ec672eb133de9c975c1522ff469b096a5c"} Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.335719 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.487598 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ssh-key\") pod \"902cbae6-47ea-4334-8623-8148bb196870\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.487739 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-inventory\") pod \"902cbae6-47ea-4334-8623-8148bb196870\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.487947 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqg5v\" (UniqueName: \"kubernetes.io/projected/902cbae6-47ea-4334-8623-8148bb196870-kube-api-access-mqg5v\") pod \"902cbae6-47ea-4334-8623-8148bb196870\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.488057 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-2\") pod \"902cbae6-47ea-4334-8623-8148bb196870\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.488119 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-telemetry-combined-ca-bundle\") pod \"902cbae6-47ea-4334-8623-8148bb196870\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.488185 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-0\") pod \"902cbae6-47ea-4334-8623-8148bb196870\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.488262 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-1\") pod \"902cbae6-47ea-4334-8623-8148bb196870\" (UID: \"902cbae6-47ea-4334-8623-8148bb196870\") " Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.497154 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/902cbae6-47ea-4334-8623-8148bb196870-kube-api-access-mqg5v" (OuterVolumeSpecName: "kube-api-access-mqg5v") pod "902cbae6-47ea-4334-8623-8148bb196870" (UID: "902cbae6-47ea-4334-8623-8148bb196870"). InnerVolumeSpecName "kube-api-access-mqg5v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.499711 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "902cbae6-47ea-4334-8623-8148bb196870" (UID: "902cbae6-47ea-4334-8623-8148bb196870"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.532562 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "902cbae6-47ea-4334-8623-8148bb196870" (UID: "902cbae6-47ea-4334-8623-8148bb196870"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.533185 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "902cbae6-47ea-4334-8623-8148bb196870" (UID: "902cbae6-47ea-4334-8623-8148bb196870"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.533215 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "902cbae6-47ea-4334-8623-8148bb196870" (UID: "902cbae6-47ea-4334-8623-8148bb196870"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.533746 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-inventory" (OuterVolumeSpecName: "inventory") pod "902cbae6-47ea-4334-8623-8148bb196870" (UID: "902cbae6-47ea-4334-8623-8148bb196870"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.548855 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "902cbae6-47ea-4334-8623-8148bb196870" (UID: "902cbae6-47ea-4334-8623-8148bb196870"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.590781 4841 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.591120 4841 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.591223 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqg5v\" (UniqueName: \"kubernetes.io/projected/902cbae6-47ea-4334-8623-8148bb196870-kube-api-access-mqg5v\") on node \"crc\" DevicePath \"\"" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.591313 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.591408 4841 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.591496 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.591843 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/902cbae6-47ea-4334-8623-8148bb196870-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.819203 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" event={"ID":"902cbae6-47ea-4334-8623-8148bb196870","Type":"ContainerDied","Data":"548bb2bc16b5d135ddff7d72efe1e6337dddd7be0d4e449d26c73bbaf90d9b5d"} Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.819257 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="548bb2bc16b5d135ddff7d72efe1e6337dddd7be0d4e449d26c73bbaf90d9b5d" Dec 03 17:46:17 crc kubenswrapper[4841]: I1203 17:46:17.819284 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-82fk8" Dec 03 17:46:39 crc kubenswrapper[4841]: I1203 17:46:39.316045 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:46:39 crc kubenswrapper[4841]: I1203 17:46:39.316530 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:46:39 crc kubenswrapper[4841]: I1203 17:46:39.316577 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:46:39 crc kubenswrapper[4841]: I1203 17:46:39.317400 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fad0ee6f64d75b8a1d0900eaf1d5ad36ded88ec55f6c520431d0f06e564f2175"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:46:39 crc kubenswrapper[4841]: I1203 17:46:39.317452 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://fad0ee6f64d75b8a1d0900eaf1d5ad36ded88ec55f6c520431d0f06e564f2175" gracePeriod=600 Dec 03 17:46:40 crc kubenswrapper[4841]: I1203 17:46:40.039326 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="fad0ee6f64d75b8a1d0900eaf1d5ad36ded88ec55f6c520431d0f06e564f2175" exitCode=0 Dec 03 17:46:40 crc kubenswrapper[4841]: I1203 17:46:40.039413 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"fad0ee6f64d75b8a1d0900eaf1d5ad36ded88ec55f6c520431d0f06e564f2175"} Dec 03 17:46:40 crc kubenswrapper[4841]: I1203 17:46:40.039844 4841 scope.go:117] "RemoveContainer" containerID="bf06bd08e397d3add3b7c9af1b380c091affb1a093852464ebbf0b5bc6a89487" Dec 03 17:46:41 crc kubenswrapper[4841]: I1203 17:46:41.064016 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"} Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.195393 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kvm"] Dec 03 17:48:53 crc kubenswrapper[4841]: E1203 17:48:53.200119 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerName="extract-content" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200141 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerName="extract-content" Dec 03 17:48:53 crc kubenswrapper[4841]: E1203 17:48:53.200164 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerName="extract-utilities" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200174 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerName="extract-utilities" Dec 03 17:48:53 crc kubenswrapper[4841]: 
E1203 17:48:53.200202 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerName="registry-server" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200215 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerName="registry-server" Dec 03 17:48:53 crc kubenswrapper[4841]: E1203 17:48:53.200239 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerName="registry-server" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200247 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerName="registry-server" Dec 03 17:48:53 crc kubenswrapper[4841]: E1203 17:48:53.200266 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerName="extract-utilities" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200276 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerName="extract-utilities" Dec 03 17:48:53 crc kubenswrapper[4841]: E1203 17:48:53.200297 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="902cbae6-47ea-4334-8623-8148bb196870" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200306 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="902cbae6-47ea-4334-8623-8148bb196870" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 17:48:53 crc kubenswrapper[4841]: E1203 17:48:53.200335 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerName="extract-content" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200343 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" 
containerName="extract-content" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200579 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d4d5ac-e5a6-4347-8e36-5a5d3d70f078" containerName="registry-server" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200606 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f8d0c3-a9e1-4062-ad8b-f7bd43e512a8" containerName="registry-server" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.200635 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="902cbae6-47ea-4334-8623-8148bb196870" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.202355 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.212186 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kvm"] Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.297627 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtsv5\" (UniqueName: \"kubernetes.io/projected/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-kube-api-access-jtsv5\") pod \"redhat-marketplace-z6kvm\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.297959 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-catalog-content\") pod \"redhat-marketplace-z6kvm\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.298027 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-utilities\") pod \"redhat-marketplace-z6kvm\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.399801 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtsv5\" (UniqueName: \"kubernetes.io/projected/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-kube-api-access-jtsv5\") pod \"redhat-marketplace-z6kvm\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.399941 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-catalog-content\") pod \"redhat-marketplace-z6kvm\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.399970 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-utilities\") pod \"redhat-marketplace-z6kvm\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.400542 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-utilities\") pod \"redhat-marketplace-z6kvm\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.400720 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-catalog-content\") pod \"redhat-marketplace-z6kvm\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.435716 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtsv5\" (UniqueName: \"kubernetes.io/projected/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-kube-api-access-jtsv5\") pod \"redhat-marketplace-z6kvm\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:53 crc kubenswrapper[4841]: I1203 17:48:53.529927 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:48:54 crc kubenswrapper[4841]: I1203 17:48:54.640999 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kvm"] Dec 03 17:48:54 crc kubenswrapper[4841]: W1203 17:48:54.645774 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92ca4b61_d745_41ff_b8fd_9ccccb2dadda.slice/crio-8124b9776d57ef1a2d1cfd6740488e7c618ae0e91d68655daab6b643b071bebb WatchSource:0}: Error finding container 8124b9776d57ef1a2d1cfd6740488e7c618ae0e91d68655daab6b643b071bebb: Status 404 returned error can't find the container with id 8124b9776d57ef1a2d1cfd6740488e7c618ae0e91d68655daab6b643b071bebb Dec 03 17:48:55 crc kubenswrapper[4841]: I1203 17:48:55.595985 4841 generic.go:334] "Generic (PLEG): container finished" podID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerID="560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f" exitCode=0 Dec 03 17:48:55 crc kubenswrapper[4841]: I1203 17:48:55.596063 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kvm" 
event={"ID":"92ca4b61-d745-41ff-b8fd-9ccccb2dadda","Type":"ContainerDied","Data":"560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f"} Dec 03 17:48:55 crc kubenswrapper[4841]: I1203 17:48:55.596392 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kvm" event={"ID":"92ca4b61-d745-41ff-b8fd-9ccccb2dadda","Type":"ContainerStarted","Data":"8124b9776d57ef1a2d1cfd6740488e7c618ae0e91d68655daab6b643b071bebb"} Dec 03 17:48:55 crc kubenswrapper[4841]: I1203 17:48:55.598877 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:48:57 crc kubenswrapper[4841]: I1203 17:48:57.623602 4841 generic.go:334] "Generic (PLEG): container finished" podID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerID="b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff" exitCode=0 Dec 03 17:48:57 crc kubenswrapper[4841]: I1203 17:48:57.623711 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kvm" event={"ID":"92ca4b61-d745-41ff-b8fd-9ccccb2dadda","Type":"ContainerDied","Data":"b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff"} Dec 03 17:48:58 crc kubenswrapper[4841]: I1203 17:48:58.636392 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kvm" event={"ID":"92ca4b61-d745-41ff-b8fd-9ccccb2dadda","Type":"ContainerStarted","Data":"8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af"} Dec 03 17:48:58 crc kubenswrapper[4841]: I1203 17:48:58.660593 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z6kvm" podStartSLOduration=3.034194247 podStartE2EDuration="5.660572153s" podCreationTimestamp="2025-12-03 17:48:53 +0000 UTC" firstStartedPulling="2025-12-03 17:48:55.598314194 +0000 UTC m=+2929.985834961" lastFinishedPulling="2025-12-03 17:48:58.22469213 +0000 UTC 
m=+2932.612212867" observedRunningTime="2025-12-03 17:48:58.660255986 +0000 UTC m=+2933.047776713" watchObservedRunningTime="2025-12-03 17:48:58.660572153 +0000 UTC m=+2933.048092890" Dec 03 17:49:03 crc kubenswrapper[4841]: I1203 17:49:03.531186 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:49:03 crc kubenswrapper[4841]: I1203 17:49:03.532233 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:49:03 crc kubenswrapper[4841]: I1203 17:49:03.612581 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:49:03 crc kubenswrapper[4841]: I1203 17:49:03.770332 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:49:03 crc kubenswrapper[4841]: I1203 17:49:03.862400 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kvm"] Dec 03 17:49:05 crc kubenswrapper[4841]: I1203 17:49:05.729965 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z6kvm" podUID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerName="registry-server" containerID="cri-o://8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af" gracePeriod=2 Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.263963 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.373024 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtsv5\" (UniqueName: \"kubernetes.io/projected/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-kube-api-access-jtsv5\") pod \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.373116 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-utilities\") pod \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.373219 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-catalog-content\") pod \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\" (UID: \"92ca4b61-d745-41ff-b8fd-9ccccb2dadda\") " Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.374536 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-utilities" (OuterVolumeSpecName: "utilities") pod "92ca4b61-d745-41ff-b8fd-9ccccb2dadda" (UID: "92ca4b61-d745-41ff-b8fd-9ccccb2dadda"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.378896 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-kube-api-access-jtsv5" (OuterVolumeSpecName: "kube-api-access-jtsv5") pod "92ca4b61-d745-41ff-b8fd-9ccccb2dadda" (UID: "92ca4b61-d745-41ff-b8fd-9ccccb2dadda"). InnerVolumeSpecName "kube-api-access-jtsv5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.393582 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "92ca4b61-d745-41ff-b8fd-9ccccb2dadda" (UID: "92ca4b61-d745-41ff-b8fd-9ccccb2dadda"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.475653 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.475687 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.475702 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtsv5\" (UniqueName: \"kubernetes.io/projected/92ca4b61-d745-41ff-b8fd-9ccccb2dadda-kube-api-access-jtsv5\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.742463 4841 generic.go:334] "Generic (PLEG): container finished" podID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerID="8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af" exitCode=0 Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.742542 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z6kvm" event={"ID":"92ca4b61-d745-41ff-b8fd-9ccccb2dadda","Type":"ContainerDied","Data":"8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af"} Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.742637 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-z6kvm" event={"ID":"92ca4b61-d745-41ff-b8fd-9ccccb2dadda","Type":"ContainerDied","Data":"8124b9776d57ef1a2d1cfd6740488e7c618ae0e91d68655daab6b643b071bebb"} Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.742564 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z6kvm" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.742683 4841 scope.go:117] "RemoveContainer" containerID="8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.775944 4841 scope.go:117] "RemoveContainer" containerID="b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.797425 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kvm"] Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.809017 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z6kvm"] Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.816382 4841 scope.go:117] "RemoveContainer" containerID="560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.866442 4841 scope.go:117] "RemoveContainer" containerID="8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af" Dec 03 17:49:06 crc kubenswrapper[4841]: E1203 17:49:06.866974 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af\": container with ID starting with 8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af not found: ID does not exist" containerID="8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.867047 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af"} err="failed to get container status \"8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af\": rpc error: code = NotFound desc = could not find container \"8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af\": container with ID starting with 8d7d42b94846a9d47c5c3fc253f761e5433d739b6e27951c13f7510e3a1418af not found: ID does not exist" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.867086 4841 scope.go:117] "RemoveContainer" containerID="b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff" Dec 03 17:49:06 crc kubenswrapper[4841]: E1203 17:49:06.867444 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff\": container with ID starting with b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff not found: ID does not exist" containerID="b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.867508 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff"} err="failed to get container status \"b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff\": rpc error: code = NotFound desc = could not find container \"b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff\": container with ID starting with b7c4364c8f73b881a136b03509a5d97d68ed04aeb5c28dd1b6d12e4adb9388ff not found: ID does not exist" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.867551 4841 scope.go:117] "RemoveContainer" containerID="560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f" Dec 03 17:49:06 crc kubenswrapper[4841]: E1203 
17:49:06.867880 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f\": container with ID starting with 560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f not found: ID does not exist" containerID="560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f" Dec 03 17:49:06 crc kubenswrapper[4841]: I1203 17:49:06.867983 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f"} err="failed to get container status \"560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f\": rpc error: code = NotFound desc = could not find container \"560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f\": container with ID starting with 560aa632f6fb68c2cf5f4dbbf96b924740c66bd79af9d3d779ec3f39e1a4ad9f not found: ID does not exist" Dec 03 17:49:08 crc kubenswrapper[4841]: I1203 17:49:08.250973 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" path="/var/lib/kubelet/pods/92ca4b61-d745-41ff-b8fd-9ccccb2dadda/volumes" Dec 03 17:49:09 crc kubenswrapper[4841]: I1203 17:49:09.316594 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:49:09 crc kubenswrapper[4841]: I1203 17:49:09.316670 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Dec 03 17:49:12 crc kubenswrapper[4841]: I1203 17:49:12.451581 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/manager/0.log" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.295693 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.296306 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" containerName="openstackclient" containerID="cri-o://6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082" gracePeriod=2 Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.315470 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.331639 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 17:49:14 crc kubenswrapper[4841]: E1203 17:49:14.332161 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" containerName="openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.332179 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" containerName="openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: E1203 17:49:14.332187 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerName="registry-server" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.332193 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerName="registry-server" Dec 03 17:49:14 crc kubenswrapper[4841]: E1203 17:49:14.332227 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerName="extract-content" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.332233 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerName="extract-content" Dec 03 17:49:14 crc kubenswrapper[4841]: E1203 17:49:14.332261 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerName="extract-utilities" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.332266 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerName="extract-utilities" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.332439 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ca4b61-d745-41ff-b8fd-9ccccb2dadda" containerName="registry-server" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.332456 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" containerName="openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.333107 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.341816 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.355863 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" podUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.364450 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.364527 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llbcf\" (UniqueName: \"kubernetes.io/projected/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-kube-api-access-llbcf\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.364556 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.364757 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config-secret\") pod \"openstackclient\" (UID: 
\"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.466985 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config-secret\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.467069 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.467145 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llbcf\" (UniqueName: \"kubernetes.io/projected/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-kube-api-access-llbcf\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.467197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.469183 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.476823 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config-secret\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.481551 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-combined-ca-bundle\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.488448 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llbcf\" (UniqueName: \"kubernetes.io/projected/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-kube-api-access-llbcf\") pod \"openstackclient\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " pod="openstack/openstackclient" Dec 03 17:49:14 crc kubenswrapper[4841]: I1203 17:49:14.664010 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.015359 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 17:49:15 crc kubenswrapper[4841]: W1203 17:49:15.018815 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9e534e4_4bcc_40f6_b7f3_de879530ebf6.slice/crio-907b5e1ebe9d857996e5b110022c5d35112fb529b6dfea81bba9a29a312603db WatchSource:0}: Error finding container 907b5e1ebe9d857996e5b110022c5d35112fb529b6dfea81bba9a29a312603db: Status 404 returned error can't find the container with id 907b5e1ebe9d857996e5b110022c5d35112fb529b6dfea81bba9a29a312603db Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.589773 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-h97vq"] Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.591607 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.601871 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-h97vq"] Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.682425 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-64fc-account-create-update-brbfq"] Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.683862 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.689383 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.691378 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z486n\" (UniqueName: \"kubernetes.io/projected/dd67b927-0545-4a8a-a3e7-ac428b92ee76-kube-api-access-z486n\") pod \"aodh-db-create-h97vq\" (UID: \"dd67b927-0545-4a8a-a3e7-ac428b92ee76\") " pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.691531 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd67b927-0545-4a8a-a3e7-ac428b92ee76-operator-scripts\") pod \"aodh-db-create-h97vq\" (UID: \"dd67b927-0545-4a8a-a3e7-ac428b92ee76\") " pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.698701 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-64fc-account-create-update-brbfq"] Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.793778 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbdsq\" (UniqueName: \"kubernetes.io/projected/a95d12fa-977b-4ac5-ba27-5c17345449e0-kube-api-access-zbdsq\") pod \"aodh-64fc-account-create-update-brbfq\" (UID: \"a95d12fa-977b-4ac5-ba27-5c17345449e0\") " pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.793955 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z486n\" (UniqueName: \"kubernetes.io/projected/dd67b927-0545-4a8a-a3e7-ac428b92ee76-kube-api-access-z486n\") pod \"aodh-db-create-h97vq\" (UID: 
\"dd67b927-0545-4a8a-a3e7-ac428b92ee76\") " pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.794087 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95d12fa-977b-4ac5-ba27-5c17345449e0-operator-scripts\") pod \"aodh-64fc-account-create-update-brbfq\" (UID: \"a95d12fa-977b-4ac5-ba27-5c17345449e0\") " pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.794128 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd67b927-0545-4a8a-a3e7-ac428b92ee76-operator-scripts\") pod \"aodh-db-create-h97vq\" (UID: \"dd67b927-0545-4a8a-a3e7-ac428b92ee76\") " pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.794925 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd67b927-0545-4a8a-a3e7-ac428b92ee76-operator-scripts\") pod \"aodh-db-create-h97vq\" (UID: \"dd67b927-0545-4a8a-a3e7-ac428b92ee76\") " pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.814898 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z486n\" (UniqueName: \"kubernetes.io/projected/dd67b927-0545-4a8a-a3e7-ac428b92ee76-kube-api-access-z486n\") pod \"aodh-db-create-h97vq\" (UID: \"dd67b927-0545-4a8a-a3e7-ac428b92ee76\") " pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.855564 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a9e534e4-4bcc-40f6-b7f3-de879530ebf6","Type":"ContainerStarted","Data":"cf3231110f26c90878f3e6247ea3f5f1ea144fbe936ee3be5855e219d0546edb"} Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 
17:49:15.855616 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"a9e534e4-4bcc-40f6-b7f3-de879530ebf6","Type":"ContainerStarted","Data":"907b5e1ebe9d857996e5b110022c5d35112fb529b6dfea81bba9a29a312603db"} Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.879689 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.879665257 podStartE2EDuration="1.879665257s" podCreationTimestamp="2025-12-03 17:49:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:49:15.878432907 +0000 UTC m=+2950.265953634" watchObservedRunningTime="2025-12-03 17:49:15.879665257 +0000 UTC m=+2950.267185994" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.895603 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95d12fa-977b-4ac5-ba27-5c17345449e0-operator-scripts\") pod \"aodh-64fc-account-create-update-brbfq\" (UID: \"a95d12fa-977b-4ac5-ba27-5c17345449e0\") " pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.895797 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbdsq\" (UniqueName: \"kubernetes.io/projected/a95d12fa-977b-4ac5-ba27-5c17345449e0-kube-api-access-zbdsq\") pod \"aodh-64fc-account-create-update-brbfq\" (UID: \"a95d12fa-977b-4ac5-ba27-5c17345449e0\") " pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.896288 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95d12fa-977b-4ac5-ba27-5c17345449e0-operator-scripts\") pod \"aodh-64fc-account-create-update-brbfq\" (UID: \"a95d12fa-977b-4ac5-ba27-5c17345449e0\") " 
pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.913511 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbdsq\" (UniqueName: \"kubernetes.io/projected/a95d12fa-977b-4ac5-ba27-5c17345449e0-kube-api-access-zbdsq\") pod \"aodh-64fc-account-create-update-brbfq\" (UID: \"a95d12fa-977b-4ac5-ba27-5c17345449e0\") " pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:15 crc kubenswrapper[4841]: I1203 17:49:15.931594 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.003038 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.392704 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-h97vq"] Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.488613 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-64fc-account-create-update-brbfq"] Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.676324 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.679764 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" podUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.822924 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkgng\" (UniqueName: \"kubernetes.io/projected/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-kube-api-access-vkgng\") pod \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.823056 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-combined-ca-bundle\") pod \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.823163 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config-secret\") pod \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.823214 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config\") pod \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\" (UID: \"967e0e4b-7b01-435c-92b7-dedc9b63dc5c\") " Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.830097 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-kube-api-access-vkgng" (OuterVolumeSpecName: "kube-api-access-vkgng") pod "967e0e4b-7b01-435c-92b7-dedc9b63dc5c" (UID: "967e0e4b-7b01-435c-92b7-dedc9b63dc5c"). InnerVolumeSpecName "kube-api-access-vkgng". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.861127 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "967e0e4b-7b01-435c-92b7-dedc9b63dc5c" (UID: "967e0e4b-7b01-435c-92b7-dedc9b63dc5c"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.866343 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "967e0e4b-7b01-435c-92b7-dedc9b63dc5c" (UID: "967e0e4b-7b01-435c-92b7-dedc9b63dc5c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.869594 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-64fc-account-create-update-brbfq" event={"ID":"a95d12fa-977b-4ac5-ba27-5c17345449e0","Type":"ContainerStarted","Data":"920957badd9afe617bdc300abbb29b9950c42bfd57dc5242c8b38fd50110c097"} Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.869744 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-64fc-account-create-update-brbfq" event={"ID":"a95d12fa-977b-4ac5-ba27-5c17345449e0","Type":"ContainerStarted","Data":"e203bafe781e25db5544a9c6a3e7c2585fe762323a64375a5d9d421039dd0116"} Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.876068 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-h97vq" event={"ID":"dd67b927-0545-4a8a-a3e7-ac428b92ee76","Type":"ContainerStarted","Data":"a6e04772d785ec2a2f55c1fd40906bda2825ca2dc3c75e7a90034c978bf12f9e"} Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.876119 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-h97vq" event={"ID":"dd67b927-0545-4a8a-a3e7-ac428b92ee76","Type":"ContainerStarted","Data":"4658a4f2ee68fb7e3d5600d83360b7da98b6d56ac7cfcbc41d461fb048fdcddc"} Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.885962 4841 generic.go:334] "Generic (PLEG): container finished" podID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" containerID="6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082" exitCode=137 Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.886048 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.886345 4841 scope.go:117] "RemoveContainer" containerID="6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.895570 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "967e0e4b-7b01-435c-92b7-dedc9b63dc5c" (UID: "967e0e4b-7b01-435c-92b7-dedc9b63dc5c"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.897704 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-64fc-account-create-update-brbfq" podStartSLOduration=1.8974130649999998 podStartE2EDuration="1.897413065s" podCreationTimestamp="2025-12-03 17:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:49:16.895092658 +0000 UTC m=+2951.282613385" watchObservedRunningTime="2025-12-03 17:49:16.897413065 +0000 UTC m=+2951.284933792" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.898475 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" podUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.918961 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-h97vq" podStartSLOduration=1.918941638 podStartE2EDuration="1.918941638s" podCreationTimestamp="2025-12-03 17:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
17:49:16.911117764 +0000 UTC m=+2951.298638491" watchObservedRunningTime="2025-12-03 17:49:16.918941638 +0000 UTC m=+2951.306462365" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.925378 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.925504 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.925565 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:16 crc kubenswrapper[4841]: I1203 17:49:16.925617 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkgng\" (UniqueName: \"kubernetes.io/projected/967e0e4b-7b01-435c-92b7-dedc9b63dc5c-kube-api-access-vkgng\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:17 crc kubenswrapper[4841]: I1203 17:49:17.005510 4841 scope.go:117] "RemoveContainer" containerID="6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082" Dec 03 17:49:17 crc kubenswrapper[4841]: E1203 17:49:17.005972 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082\": container with ID starting with 6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082 not found: ID does not exist" containerID="6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082" Dec 03 17:49:17 crc kubenswrapper[4841]: I1203 17:49:17.006021 4841 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082"} err="failed to get container status \"6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082\": rpc error: code = NotFound desc = could not find container \"6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082\": container with ID starting with 6953dc8d7d92a8c50ccba3198699bf9e74f1eb50d6f87e70e75280c24dec0082 not found: ID does not exist" Dec 03 17:49:17 crc kubenswrapper[4841]: I1203 17:49:17.210144 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" podUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" Dec 03 17:49:17 crc kubenswrapper[4841]: I1203 17:49:17.900847 4841 generic.go:334] "Generic (PLEG): container finished" podID="a95d12fa-977b-4ac5-ba27-5c17345449e0" containerID="920957badd9afe617bdc300abbb29b9950c42bfd57dc5242c8b38fd50110c097" exitCode=0 Dec 03 17:49:17 crc kubenswrapper[4841]: I1203 17:49:17.901001 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-64fc-account-create-update-brbfq" event={"ID":"a95d12fa-977b-4ac5-ba27-5c17345449e0","Type":"ContainerDied","Data":"920957badd9afe617bdc300abbb29b9950c42bfd57dc5242c8b38fd50110c097"} Dec 03 17:49:17 crc kubenswrapper[4841]: I1203 17:49:17.903832 4841 generic.go:334] "Generic (PLEG): container finished" podID="dd67b927-0545-4a8a-a3e7-ac428b92ee76" containerID="a6e04772d785ec2a2f55c1fd40906bda2825ca2dc3c75e7a90034c978bf12f9e" exitCode=0 Dec 03 17:49:17 crc kubenswrapper[4841]: I1203 17:49:17.903935 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-h97vq" event={"ID":"dd67b927-0545-4a8a-a3e7-ac428b92ee76","Type":"ContainerDied","Data":"a6e04772d785ec2a2f55c1fd40906bda2825ca2dc3c75e7a90034c978bf12f9e"} Dec 03 17:49:18 crc kubenswrapper[4841]: I1203 17:49:18.250995 4841 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="967e0e4b-7b01-435c-92b7-dedc9b63dc5c" path="/var/lib/kubelet/pods/967e0e4b-7b01-435c-92b7-dedc9b63dc5c/volumes" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.396654 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.403467 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.584170 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z486n\" (UniqueName: \"kubernetes.io/projected/dd67b927-0545-4a8a-a3e7-ac428b92ee76-kube-api-access-z486n\") pod \"dd67b927-0545-4a8a-a3e7-ac428b92ee76\" (UID: \"dd67b927-0545-4a8a-a3e7-ac428b92ee76\") " Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.584608 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd67b927-0545-4a8a-a3e7-ac428b92ee76-operator-scripts\") pod \"dd67b927-0545-4a8a-a3e7-ac428b92ee76\" (UID: \"dd67b927-0545-4a8a-a3e7-ac428b92ee76\") " Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.584722 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95d12fa-977b-4ac5-ba27-5c17345449e0-operator-scripts\") pod \"a95d12fa-977b-4ac5-ba27-5c17345449e0\" (UID: \"a95d12fa-977b-4ac5-ba27-5c17345449e0\") " Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.585323 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd67b927-0545-4a8a-a3e7-ac428b92ee76-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dd67b927-0545-4a8a-a3e7-ac428b92ee76" (UID: "dd67b927-0545-4a8a-a3e7-ac428b92ee76"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.585343 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a95d12fa-977b-4ac5-ba27-5c17345449e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a95d12fa-977b-4ac5-ba27-5c17345449e0" (UID: "a95d12fa-977b-4ac5-ba27-5c17345449e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.585470 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbdsq\" (UniqueName: \"kubernetes.io/projected/a95d12fa-977b-4ac5-ba27-5c17345449e0-kube-api-access-zbdsq\") pod \"a95d12fa-977b-4ac5-ba27-5c17345449e0\" (UID: \"a95d12fa-977b-4ac5-ba27-5c17345449e0\") " Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.586071 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dd67b927-0545-4a8a-a3e7-ac428b92ee76-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.586098 4841 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a95d12fa-977b-4ac5-ba27-5c17345449e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.591032 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd67b927-0545-4a8a-a3e7-ac428b92ee76-kube-api-access-z486n" (OuterVolumeSpecName: "kube-api-access-z486n") pod "dd67b927-0545-4a8a-a3e7-ac428b92ee76" (UID: "dd67b927-0545-4a8a-a3e7-ac428b92ee76"). InnerVolumeSpecName "kube-api-access-z486n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.591122 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a95d12fa-977b-4ac5-ba27-5c17345449e0-kube-api-access-zbdsq" (OuterVolumeSpecName: "kube-api-access-zbdsq") pod "a95d12fa-977b-4ac5-ba27-5c17345449e0" (UID: "a95d12fa-977b-4ac5-ba27-5c17345449e0"). InnerVolumeSpecName "kube-api-access-zbdsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.688469 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z486n\" (UniqueName: \"kubernetes.io/projected/dd67b927-0545-4a8a-a3e7-ac428b92ee76-kube-api-access-z486n\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.688537 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbdsq\" (UniqueName: \"kubernetes.io/projected/a95d12fa-977b-4ac5-ba27-5c17345449e0-kube-api-access-zbdsq\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.935385 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-h97vq" event={"ID":"dd67b927-0545-4a8a-a3e7-ac428b92ee76","Type":"ContainerDied","Data":"4658a4f2ee68fb7e3d5600d83360b7da98b6d56ac7cfcbc41d461fb048fdcddc"} Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.935451 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4658a4f2ee68fb7e3d5600d83360b7da98b6d56ac7cfcbc41d461fb048fdcddc" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.935505 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-h97vq" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.937093 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-64fc-account-create-update-brbfq" event={"ID":"a95d12fa-977b-4ac5-ba27-5c17345449e0","Type":"ContainerDied","Data":"e203bafe781e25db5544a9c6a3e7c2585fe762323a64375a5d9d421039dd0116"} Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.937137 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e203bafe781e25db5544a9c6a3e7c2585fe762323a64375a5d9d421039dd0116" Dec 03 17:49:19 crc kubenswrapper[4841]: I1203 17:49:19.937195 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-64fc-account-create-update-brbfq" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.069733 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-29tn9"] Dec 03 17:49:21 crc kubenswrapper[4841]: E1203 17:49:21.070528 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a95d12fa-977b-4ac5-ba27-5c17345449e0" containerName="mariadb-account-create-update" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.070544 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a95d12fa-977b-4ac5-ba27-5c17345449e0" containerName="mariadb-account-create-update" Dec 03 17:49:21 crc kubenswrapper[4841]: E1203 17:49:21.070575 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd67b927-0545-4a8a-a3e7-ac428b92ee76" containerName="mariadb-database-create" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.070595 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd67b927-0545-4a8a-a3e7-ac428b92ee76" containerName="mariadb-database-create" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.070863 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd67b927-0545-4a8a-a3e7-ac428b92ee76" containerName="mariadb-database-create" 
Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.070881 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a95d12fa-977b-4ac5-ba27-5c17345449e0" containerName="mariadb-account-create-update" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.071793 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.075547 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.077115 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9f96b" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.078682 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.079444 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-29tn9"] Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.079695 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.122502 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-combined-ca-bundle\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.122589 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-scripts\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc 
kubenswrapper[4841]: I1203 17:49:21.122698 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-config-data\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.122741 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw5rb\" (UniqueName: \"kubernetes.io/projected/2437c8d9-3720-48c6-ad74-252479515189-kube-api-access-lw5rb\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.224986 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-combined-ca-bundle\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.225363 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-scripts\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.225535 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-config-data\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.225640 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lw5rb\" (UniqueName: \"kubernetes.io/projected/2437c8d9-3720-48c6-ad74-252479515189-kube-api-access-lw5rb\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.229031 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-config-data\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.233141 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-combined-ca-bundle\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.236254 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-scripts\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.246509 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw5rb\" (UniqueName: \"kubernetes.io/projected/2437c8d9-3720-48c6-ad74-252479515189-kube-api-access-lw5rb\") pod \"aodh-db-sync-29tn9\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.398815 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.708828 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-29tn9"] Dec 03 17:49:21 crc kubenswrapper[4841]: I1203 17:49:21.962233 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-29tn9" event={"ID":"2437c8d9-3720-48c6-ad74-252479515189","Type":"ContainerStarted","Data":"d4f9ef02c7912b59423cdc7dc7b9acdb57da81d99811f9d9b8f6d69a5c5e1204"} Dec 03 17:49:26 crc kubenswrapper[4841]: I1203 17:49:26.013380 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-29tn9" event={"ID":"2437c8d9-3720-48c6-ad74-252479515189","Type":"ContainerStarted","Data":"3f24b38f44e89dfddf7a8a2544c0c693096c9ebb32ea34ba65f29e96d204b920"} Dec 03 17:49:26 crc kubenswrapper[4841]: I1203 17:49:26.046326 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-29tn9" podStartSLOduration=1.2704936359999999 podStartE2EDuration="5.04630273s" podCreationTimestamp="2025-12-03 17:49:21 +0000 UTC" firstStartedPulling="2025-12-03 17:49:21.713638689 +0000 UTC m=+2956.101159416" lastFinishedPulling="2025-12-03 17:49:25.489447753 +0000 UTC m=+2959.876968510" observedRunningTime="2025-12-03 17:49:26.037185074 +0000 UTC m=+2960.424705811" watchObservedRunningTime="2025-12-03 17:49:26.04630273 +0000 UTC m=+2960.433823467" Dec 03 17:49:28 crc kubenswrapper[4841]: I1203 17:49:28.042079 4841 generic.go:334] "Generic (PLEG): container finished" podID="2437c8d9-3720-48c6-ad74-252479515189" containerID="3f24b38f44e89dfddf7a8a2544c0c693096c9ebb32ea34ba65f29e96d204b920" exitCode=0 Dec 03 17:49:28 crc kubenswrapper[4841]: I1203 17:49:28.042223 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-29tn9" 
event={"ID":"2437c8d9-3720-48c6-ad74-252479515189","Type":"ContainerDied","Data":"3f24b38f44e89dfddf7a8a2544c0c693096c9ebb32ea34ba65f29e96d204b920"} Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.382017 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.411695 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-combined-ca-bundle\") pod \"2437c8d9-3720-48c6-ad74-252479515189\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.412230 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw5rb\" (UniqueName: \"kubernetes.io/projected/2437c8d9-3720-48c6-ad74-252479515189-kube-api-access-lw5rb\") pod \"2437c8d9-3720-48c6-ad74-252479515189\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.412541 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-config-data\") pod \"2437c8d9-3720-48c6-ad74-252479515189\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.413240 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-scripts\") pod \"2437c8d9-3720-48c6-ad74-252479515189\" (UID: \"2437c8d9-3720-48c6-ad74-252479515189\") " Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.426016 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2437c8d9-3720-48c6-ad74-252479515189-kube-api-access-lw5rb" (OuterVolumeSpecName: 
"kube-api-access-lw5rb") pod "2437c8d9-3720-48c6-ad74-252479515189" (UID: "2437c8d9-3720-48c6-ad74-252479515189"). InnerVolumeSpecName "kube-api-access-lw5rb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.435436 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-scripts" (OuterVolumeSpecName: "scripts") pod "2437c8d9-3720-48c6-ad74-252479515189" (UID: "2437c8d9-3720-48c6-ad74-252479515189"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.448813 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2437c8d9-3720-48c6-ad74-252479515189" (UID: "2437c8d9-3720-48c6-ad74-252479515189"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.454467 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-config-data" (OuterVolumeSpecName: "config-data") pod "2437c8d9-3720-48c6-ad74-252479515189" (UID: "2437c8d9-3720-48c6-ad74-252479515189"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.515766 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw5rb\" (UniqueName: \"kubernetes.io/projected/2437c8d9-3720-48c6-ad74-252479515189-kube-api-access-lw5rb\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.516011 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.516022 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:29 crc kubenswrapper[4841]: I1203 17:49:29.516032 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2437c8d9-3720-48c6-ad74-252479515189-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.065395 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-29tn9" event={"ID":"2437c8d9-3720-48c6-ad74-252479515189","Type":"ContainerDied","Data":"d4f9ef02c7912b59423cdc7dc7b9acdb57da81d99811f9d9b8f6d69a5c5e1204"} Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.065677 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4f9ef02c7912b59423cdc7dc7b9acdb57da81d99811f9d9b8f6d69a5c5e1204" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.065453 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-29tn9" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.679300 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 17:49:30 crc kubenswrapper[4841]: E1203 17:49:30.680097 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2437c8d9-3720-48c6-ad74-252479515189" containerName="aodh-db-sync" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.680136 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="2437c8d9-3720-48c6-ad74-252479515189" containerName="aodh-db-sync" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.680459 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="2437c8d9-3720-48c6-ad74-252479515189" containerName="aodh-db-sync" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.682987 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.685278 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.685340 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.685749 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9f96b" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.692625 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.743176 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldmxc\" (UniqueName: \"kubernetes.io/projected/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-kube-api-access-ldmxc\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc 
kubenswrapper[4841]: I1203 17:49:30.743245 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-config-data\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.743368 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-scripts\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.743495 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.844785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.844998 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ldmxc\" (UniqueName: \"kubernetes.io/projected/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-kube-api-access-ldmxc\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.845041 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-config-data\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.845090 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-scripts\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.870673 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-scripts\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.872183 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-combined-ca-bundle\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.872990 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-config-data\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:30 crc kubenswrapper[4841]: I1203 17:49:30.875484 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldmxc\" (UniqueName: \"kubernetes.io/projected/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-kube-api-access-ldmxc\") pod \"aodh-0\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " pod="openstack/aodh-0" Dec 03 17:49:31 crc kubenswrapper[4841]: I1203 17:49:31.008014 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 17:49:31 crc kubenswrapper[4841]: I1203 17:49:31.544205 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 17:49:31 crc kubenswrapper[4841]: W1203 17:49:31.546712 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ba640a0_7ad9_4deb_b5fe_e017d05aa746.slice/crio-0e2edcee23f6b73eef8351c629431ae0a6079ffac682120209035be7433a1d07 WatchSource:0}: Error finding container 0e2edcee23f6b73eef8351c629431ae0a6079ffac682120209035be7433a1d07: Status 404 returned error can't find the container with id 0e2edcee23f6b73eef8351c629431ae0a6079ffac682120209035be7433a1d07 Dec 03 17:49:32 crc kubenswrapper[4841]: I1203 17:49:32.086407 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerStarted","Data":"0e2edcee23f6b73eef8351c629431ae0a6079ffac682120209035be7433a1d07"} Dec 03 17:49:32 crc kubenswrapper[4841]: I1203 17:49:32.638686 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:49:32 crc kubenswrapper[4841]: I1203 17:49:32.639507 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="ceilometer-central-agent" containerID="cri-o://8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f" gracePeriod=30 Dec 03 17:49:32 crc kubenswrapper[4841]: I1203 17:49:32.639635 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="ceilometer-notification-agent" containerID="cri-o://1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2" gracePeriod=30 Dec 03 17:49:32 crc kubenswrapper[4841]: I1203 17:49:32.639652 4841 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="sg-core" containerID="cri-o://8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a" gracePeriod=30 Dec 03 17:49:32 crc kubenswrapper[4841]: I1203 17:49:32.639583 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="proxy-httpd" containerID="cri-o://5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817" gracePeriod=30 Dec 03 17:49:33 crc kubenswrapper[4841]: I1203 17:49:33.102752 4841 generic.go:334] "Generic (PLEG): container finished" podID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerID="5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817" exitCode=0 Dec 03 17:49:33 crc kubenswrapper[4841]: I1203 17:49:33.103101 4841 generic.go:334] "Generic (PLEG): container finished" podID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerID="8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a" exitCode=2 Dec 03 17:49:33 crc kubenswrapper[4841]: I1203 17:49:33.102916 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerDied","Data":"5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817"} Dec 03 17:49:33 crc kubenswrapper[4841]: I1203 17:49:33.103177 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerDied","Data":"8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a"} Dec 03 17:49:33 crc kubenswrapper[4841]: I1203 17:49:33.104803 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerStarted","Data":"f091997077ea3a12a6d4125e73e9ec27fdf9e9a730cc5ce315290e7a08ca21f2"} Dec 03 17:49:33 crc 
kubenswrapper[4841]: I1203 17:49:33.809500 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 17:49:34 crc kubenswrapper[4841]: I1203 17:49:34.123786 4841 generic.go:334] "Generic (PLEG): container finished" podID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerID="8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f" exitCode=0 Dec 03 17:49:34 crc kubenswrapper[4841]: I1203 17:49:34.123852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerDied","Data":"8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f"} Dec 03 17:49:35 crc kubenswrapper[4841]: I1203 17:49:35.136408 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerStarted","Data":"2dac037d4490b24006e8900d232e3c9423796327e9b144e48cf06d33fd69b5b7"} Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.194141 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerStarted","Data":"e81de06a99593194ae336167a4246a396adf696fea691d6f35c7ab255a897784"} Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.521323 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.655422 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-ceilometer-tls-certs\") pod \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.655606 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-run-httpd\") pod \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.655645 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-combined-ca-bundle\") pod \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.655752 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fzrg\" (UniqueName: \"kubernetes.io/projected/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-kube-api-access-6fzrg\") pod \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.655787 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-config-data\") pod \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.655842 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-log-httpd\") pod \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.655876 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-scripts\") pod \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.655900 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-sg-core-conf-yaml\") pod \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\" (UID: \"8fd024bf-ffe0-4569-ab22-9c6ddecb1431\") " Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.656071 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "8fd024bf-ffe0-4569-ab22-9c6ddecb1431" (UID: "8fd024bf-ffe0-4569-ab22-9c6ddecb1431"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.656372 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "8fd024bf-ffe0-4569-ab22-9c6ddecb1431" (UID: "8fd024bf-ffe0-4569-ab22-9c6ddecb1431"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.656422 4841 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.666262 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-scripts" (OuterVolumeSpecName: "scripts") pod "8fd024bf-ffe0-4569-ab22-9c6ddecb1431" (UID: "8fd024bf-ffe0-4569-ab22-9c6ddecb1431"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.666332 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-kube-api-access-6fzrg" (OuterVolumeSpecName: "kube-api-access-6fzrg") pod "8fd024bf-ffe0-4569-ab22-9c6ddecb1431" (UID: "8fd024bf-ffe0-4569-ab22-9c6ddecb1431"). InnerVolumeSpecName "kube-api-access-6fzrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.687641 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "8fd024bf-ffe0-4569-ab22-9c6ddecb1431" (UID: "8fd024bf-ffe0-4569-ab22-9c6ddecb1431"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.706624 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "8fd024bf-ffe0-4569-ab22-9c6ddecb1431" (UID: "8fd024bf-ffe0-4569-ab22-9c6ddecb1431"). 
InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.758710 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fzrg\" (UniqueName: \"kubernetes.io/projected/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-kube-api-access-6fzrg\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.758751 4841 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.758765 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.758785 4841 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.758796 4841 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.778957 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-config-data" (OuterVolumeSpecName: "config-data") pod "8fd024bf-ffe0-4569-ab22-9c6ddecb1431" (UID: "8fd024bf-ffe0-4569-ab22-9c6ddecb1431"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.787111 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8fd024bf-ffe0-4569-ab22-9c6ddecb1431" (UID: "8fd024bf-ffe0-4569-ab22-9c6ddecb1431"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.861295 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:36 crc kubenswrapper[4841]: I1203 17:49:36.861330 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8fd024bf-ffe0-4569-ab22-9c6ddecb1431-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.205414 4841 generic.go:334] "Generic (PLEG): container finished" podID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerID="1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2" exitCode=0 Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.205534 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.205542 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerDied","Data":"1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2"} Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.205788 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"8fd024bf-ffe0-4569-ab22-9c6ddecb1431","Type":"ContainerDied","Data":"c3f086d46ccf36ba96e648b5469cabf2af85bc86b3dd3f9a3f83fe1dc3abaf10"} Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.205821 4841 scope.go:117] "RemoveContainer" containerID="5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.231186 4841 scope.go:117] "RemoveContainer" containerID="8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.270290 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.280787 4841 scope.go:117] "RemoveContainer" containerID="1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.286168 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.293206 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:49:37 crc kubenswrapper[4841]: E1203 17:49:37.293621 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="proxy-httpd" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.293639 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" 
containerName="proxy-httpd" Dec 03 17:49:37 crc kubenswrapper[4841]: E1203 17:49:37.293682 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="sg-core" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.293689 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="sg-core" Dec 03 17:49:37 crc kubenswrapper[4841]: E1203 17:49:37.293706 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="ceilometer-central-agent" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.293712 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="ceilometer-central-agent" Dec 03 17:49:37 crc kubenswrapper[4841]: E1203 17:49:37.293720 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="ceilometer-notification-agent" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.293726 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="ceilometer-notification-agent" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.293922 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="ceilometer-central-agent" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.293942 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="sg-core" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.293955 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="proxy-httpd" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.293972 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" containerName="ceilometer-notification-agent" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.295945 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.297795 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.298412 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.298754 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.300899 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.304276 4841 scope.go:117] "RemoveContainer" containerID="8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.339058 4841 scope.go:117] "RemoveContainer" containerID="5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817" Dec 03 17:49:37 crc kubenswrapper[4841]: E1203 17:49:37.339576 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817\": container with ID starting with 5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817 not found: ID does not exist" containerID="5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.339670 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817"} err="failed to get container 
status \"5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817\": rpc error: code = NotFound desc = could not find container \"5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817\": container with ID starting with 5d5998e77e6c5060d97c3dd8d4617903d47254255b90bbb8edc630b273ddf817 not found: ID does not exist" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.339751 4841 scope.go:117] "RemoveContainer" containerID="8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a" Dec 03 17:49:37 crc kubenswrapper[4841]: E1203 17:49:37.340174 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a\": container with ID starting with 8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a not found: ID does not exist" containerID="8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.340198 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a"} err="failed to get container status \"8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a\": rpc error: code = NotFound desc = could not find container \"8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a\": container with ID starting with 8ce9252b850a6f600588a35f7be6aeb8e2e0e9e49e15aa3e4b79b036d8dfac7a not found: ID does not exist" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.340217 4841 scope.go:117] "RemoveContainer" containerID="1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2" Dec 03 17:49:37 crc kubenswrapper[4841]: E1203 17:49:37.340606 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2\": container with ID starting with 1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2 not found: ID does not exist" containerID="1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.340683 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2"} err="failed to get container status \"1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2\": rpc error: code = NotFound desc = could not find container \"1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2\": container with ID starting with 1c48f702559472fbd40e75c7fcea231af1e6bceaf25b0f3ce3a3c4363c7f51b2 not found: ID does not exist" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.340714 4841 scope.go:117] "RemoveContainer" containerID="8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f" Dec 03 17:49:37 crc kubenswrapper[4841]: E1203 17:49:37.344618 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f\": container with ID starting with 8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f not found: ID does not exist" containerID="8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.344657 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f"} err="failed to get container status \"8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f\": rpc error: code = NotFound desc = could not find container \"8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f\": container with ID 
starting with 8b8f71a124f0be8d6ebdc2bbfae84524abb8e126010593ffa2b3c09c78a8d18f not found: ID does not exist" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.374290 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.374343 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4cx\" (UniqueName: \"kubernetes.io/projected/0610d33e-9635-49f9-a9db-f2bbac336470-kube-api-access-xg4cx\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.374371 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.374400 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0610d33e-9635-49f9-a9db-f2bbac336470-run-httpd\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.374415 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " 
pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.374431 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-scripts\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.374474 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-config-data\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.374497 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0610d33e-9635-49f9-a9db-f2bbac336470-log-httpd\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.477636 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.477766 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg4cx\" (UniqueName: \"kubernetes.io/projected/0610d33e-9635-49f9-a9db-f2bbac336470-kube-api-access-xg4cx\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.477826 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.477889 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0610d33e-9635-49f9-a9db-f2bbac336470-run-httpd\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.477951 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.477994 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-scripts\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.478089 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-config-data\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.478143 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0610d33e-9635-49f9-a9db-f2bbac336470-log-httpd\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc 
kubenswrapper[4841]: I1203 17:49:37.478603 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0610d33e-9635-49f9-a9db-f2bbac336470-run-httpd\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.479022 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0610d33e-9635-49f9-a9db-f2bbac336470-log-httpd\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.484684 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-scripts\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.484682 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.485391 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.487728 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-config-data\") pod \"ceilometer-0\" (UID: 
\"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.493685 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/0610d33e-9635-49f9-a9db-f2bbac336470-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.497356 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg4cx\" (UniqueName: \"kubernetes.io/projected/0610d33e-9635-49f9-a9db-f2bbac336470-kube-api-access-xg4cx\") pod \"ceilometer-0\" (UID: \"0610d33e-9635-49f9-a9db-f2bbac336470\") " pod="openstack/ceilometer-0" Dec 03 17:49:37 crc kubenswrapper[4841]: I1203 17:49:37.667157 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 17:49:38 crc kubenswrapper[4841]: I1203 17:49:38.141127 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 17:49:38 crc kubenswrapper[4841]: W1203 17:49:38.145028 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0610d33e_9635_49f9_a9db_f2bbac336470.slice/crio-f3007501c535f59a0e32c2a64a1c921e5b320b70393d6e5b3ea65b9c055baac3 WatchSource:0}: Error finding container f3007501c535f59a0e32c2a64a1c921e5b320b70393d6e5b3ea65b9c055baac3: Status 404 returned error can't find the container with id f3007501c535f59a0e32c2a64a1c921e5b320b70393d6e5b3ea65b9c055baac3 Dec 03 17:49:38 crc kubenswrapper[4841]: I1203 17:49:38.216451 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerStarted","Data":"657caf57becaaa0fbbfbc9aa20952e5bfc1c9066f183f876977ce09506fd608a"} Dec 03 17:49:38 crc kubenswrapper[4841]: I1203 
17:49:38.217478 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-api" containerID="cri-o://f091997077ea3a12a6d4125e73e9ec27fdf9e9a730cc5ce315290e7a08ca21f2" gracePeriod=30 Dec 03 17:49:38 crc kubenswrapper[4841]: I1203 17:49:38.219163 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-listener" containerID="cri-o://657caf57becaaa0fbbfbc9aa20952e5bfc1c9066f183f876977ce09506fd608a" gracePeriod=30 Dec 03 17:49:38 crc kubenswrapper[4841]: I1203 17:49:38.219391 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-notifier" containerID="cri-o://e81de06a99593194ae336167a4246a396adf696fea691d6f35c7ab255a897784" gracePeriod=30 Dec 03 17:49:38 crc kubenswrapper[4841]: I1203 17:49:38.219528 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-evaluator" containerID="cri-o://2dac037d4490b24006e8900d232e3c9423796327e9b144e48cf06d33fd69b5b7" gracePeriod=30 Dec 03 17:49:38 crc kubenswrapper[4841]: I1203 17:49:38.221889 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0610d33e-9635-49f9-a9db-f2bbac336470","Type":"ContainerStarted","Data":"f3007501c535f59a0e32c2a64a1c921e5b320b70393d6e5b3ea65b9c055baac3"} Dec 03 17:49:38 crc kubenswrapper[4841]: I1203 17:49:38.257856 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.663250752 podStartE2EDuration="8.257796782s" podCreationTimestamp="2025-12-03 17:49:30 +0000 UTC" firstStartedPulling="2025-12-03 17:49:31.549392626 +0000 UTC m=+2965.936913353" lastFinishedPulling="2025-12-03 
17:49:37.143938626 +0000 UTC m=+2971.531459383" observedRunningTime="2025-12-03 17:49:38.249388434 +0000 UTC m=+2972.636909161" watchObservedRunningTime="2025-12-03 17:49:38.257796782 +0000 UTC m=+2972.645317519" Dec 03 17:49:38 crc kubenswrapper[4841]: I1203 17:49:38.259537 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fd024bf-ffe0-4569-ab22-9c6ddecb1431" path="/var/lib/kubelet/pods/8fd024bf-ffe0-4569-ab22-9c6ddecb1431/volumes" Dec 03 17:49:39 crc kubenswrapper[4841]: I1203 17:49:39.232225 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0610d33e-9635-49f9-a9db-f2bbac336470","Type":"ContainerStarted","Data":"e5690e8e351688cb8187dd0ec7ff106ec137fac5cd64b88e0df971934d020045"} Dec 03 17:49:39 crc kubenswrapper[4841]: I1203 17:49:39.235772 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerID="e81de06a99593194ae336167a4246a396adf696fea691d6f35c7ab255a897784" exitCode=0 Dec 03 17:49:39 crc kubenswrapper[4841]: I1203 17:49:39.235801 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerID="2dac037d4490b24006e8900d232e3c9423796327e9b144e48cf06d33fd69b5b7" exitCode=0 Dec 03 17:49:39 crc kubenswrapper[4841]: I1203 17:49:39.235814 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerID="f091997077ea3a12a6d4125e73e9ec27fdf9e9a730cc5ce315290e7a08ca21f2" exitCode=0 Dec 03 17:49:39 crc kubenswrapper[4841]: I1203 17:49:39.235813 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerDied","Data":"e81de06a99593194ae336167a4246a396adf696fea691d6f35c7ab255a897784"} Dec 03 17:49:39 crc kubenswrapper[4841]: I1203 17:49:39.235841 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerDied","Data":"2dac037d4490b24006e8900d232e3c9423796327e9b144e48cf06d33fd69b5b7"} Dec 03 17:49:39 crc kubenswrapper[4841]: I1203 17:49:39.235850 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerDied","Data":"f091997077ea3a12a6d4125e73e9ec27fdf9e9a730cc5ce315290e7a08ca21f2"} Dec 03 17:49:39 crc kubenswrapper[4841]: I1203 17:49:39.317689 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:49:39 crc kubenswrapper[4841]: I1203 17:49:39.317738 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:49:40 crc kubenswrapper[4841]: I1203 17:49:40.258054 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0610d33e-9635-49f9-a9db-f2bbac336470","Type":"ContainerStarted","Data":"07b746b3cba04ab02949a14b9e0c97145a29df35ee953b6a35225d5301aea8dd"} Dec 03 17:49:40 crc kubenswrapper[4841]: I1203 17:49:40.258405 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0610d33e-9635-49f9-a9db-f2bbac336470","Type":"ContainerStarted","Data":"71a9d234fffb29582bc257aa89e806432ffe0b180398530a26f3479dea179880"} Dec 03 17:49:42 crc kubenswrapper[4841]: I1203 17:49:42.283481 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0610d33e-9635-49f9-a9db-f2bbac336470","Type":"ContainerStarted","Data":"e44a298194e014c2c4fa057fa24ee29358bdeb138fac4f8a24310a5a4f27c61e"} Dec 03 17:49:42 crc kubenswrapper[4841]: I1203 17:49:42.284327 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 17:49:42 crc kubenswrapper[4841]: I1203 17:49:42.331720 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.960380434 podStartE2EDuration="5.33169339s" podCreationTimestamp="2025-12-03 17:49:37 +0000 UTC" firstStartedPulling="2025-12-03 17:49:38.147841042 +0000 UTC m=+2972.535361779" lastFinishedPulling="2025-12-03 17:49:41.519154008 +0000 UTC m=+2975.906674735" observedRunningTime="2025-12-03 17:49:42.316822072 +0000 UTC m=+2976.704342829" watchObservedRunningTime="2025-12-03 17:49:42.33169339 +0000 UTC m=+2976.719214147" Dec 03 17:50:07 crc kubenswrapper[4841]: I1203 17:50:07.681685 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 17:50:08 crc kubenswrapper[4841]: I1203 17:50:08.762230 4841 generic.go:334] "Generic (PLEG): container finished" podID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerID="657caf57becaaa0fbbfbc9aa20952e5bfc1c9066f183f876977ce09506fd608a" exitCode=137 Dec 03 17:50:08 crc kubenswrapper[4841]: I1203 17:50:08.762287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerDied","Data":"657caf57becaaa0fbbfbc9aa20952e5bfc1c9066f183f876977ce09506fd608a"} Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.093033 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.224292 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldmxc\" (UniqueName: \"kubernetes.io/projected/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-kube-api-access-ldmxc\") pod \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.224520 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-config-data\") pod \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.224547 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-scripts\") pod \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.224621 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-combined-ca-bundle\") pod \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\" (UID: \"9ba640a0-7ad9-4deb-b5fe-e017d05aa746\") " Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.230970 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-kube-api-access-ldmxc" (OuterVolumeSpecName: "kube-api-access-ldmxc") pod "9ba640a0-7ad9-4deb-b5fe-e017d05aa746" (UID: "9ba640a0-7ad9-4deb-b5fe-e017d05aa746"). InnerVolumeSpecName "kube-api-access-ldmxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.239663 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-scripts" (OuterVolumeSpecName: "scripts") pod "9ba640a0-7ad9-4deb-b5fe-e017d05aa746" (UID: "9ba640a0-7ad9-4deb-b5fe-e017d05aa746"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.316662 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.316718 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.316771 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.317609 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.317692 4841 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" gracePeriod=600
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.326845 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ldmxc\" (UniqueName: \"kubernetes.io/projected/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-kube-api-access-ldmxc\") on node \"crc\" DevicePath \"\""
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.326875 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.369750 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ba640a0-7ad9-4deb-b5fe-e017d05aa746" (UID: "9ba640a0-7ad9-4deb-b5fe-e017d05aa746"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.380344 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-config-data" (OuterVolumeSpecName: "config-data") pod "9ba640a0-7ad9-4deb-b5fe-e017d05aa746" (UID: "9ba640a0-7ad9-4deb-b5fe-e017d05aa746"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.429842 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.429998 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ba640a0-7ad9-4deb-b5fe-e017d05aa746-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 17:50:09 crc kubenswrapper[4841]: E1203 17:50:09.440081 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.776147 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" exitCode=0
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.776228 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"}
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.776266 4841 scope.go:117] "RemoveContainer" containerID="fad0ee6f64d75b8a1d0900eaf1d5ad36ded88ec55f6c520431d0f06e564f2175"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.776960 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:50:09 crc kubenswrapper[4841]: E1203 17:50:09.777259 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.783440 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"9ba640a0-7ad9-4deb-b5fe-e017d05aa746","Type":"ContainerDied","Data":"0e2edcee23f6b73eef8351c629431ae0a6079ffac682120209035be7433a1d07"}
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.783526 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.864386 4841 scope.go:117] "RemoveContainer" containerID="657caf57becaaa0fbbfbc9aa20952e5bfc1c9066f183f876977ce09506fd608a"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.907301 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.920031 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.929757 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Dec 03 17:50:09 crc kubenswrapper[4841]: E1203 17:50:09.930188 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-evaluator"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.930200 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-evaluator"
Dec 03 17:50:09 crc kubenswrapper[4841]: E1203 17:50:09.930225 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-api"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.930234 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-api"
Dec 03 17:50:09 crc kubenswrapper[4841]: E1203 17:50:09.930258 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-notifier"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.930264 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-notifier"
Dec 03 17:50:09 crc kubenswrapper[4841]: E1203 17:50:09.930274 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-listener"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.930279 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-listener"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.930440 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-listener"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.930454 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-api"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.930472 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-notifier"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.930483 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" containerName="aodh-evaluator"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.932398 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.935404 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.935590 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.935719 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.935925 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.936035 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9f96b"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.946092 4841 scope.go:117] "RemoveContainer" containerID="e81de06a99593194ae336167a4246a396adf696fea691d6f35c7ab255a897784"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.952616 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.971222 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chhht\" (UniqueName: \"kubernetes.io/projected/fc8a6ecd-e534-4e5f-af9d-93962acc5642-kube-api-access-chhht\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.971340 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-public-tls-certs\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.971366 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-internal-tls-certs\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.971395 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-config-data\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.971423 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-scripts\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.971452 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-combined-ca-bundle\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:09 crc kubenswrapper[4841]: I1203 17:50:09.982397 4841 scope.go:117] "RemoveContainer" containerID="2dac037d4490b24006e8900d232e3c9423796327e9b144e48cf06d33fd69b5b7"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.000524 4841 scope.go:117] "RemoveContainer" containerID="f091997077ea3a12a6d4125e73e9ec27fdf9e9a730cc5ce315290e7a08ca21f2"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.074424 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-public-tls-certs\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.074478 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-internal-tls-certs\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.074536 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-config-data\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.074579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-scripts\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.074621 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-combined-ca-bundle\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.074698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chhht\" (UniqueName: \"kubernetes.io/projected/fc8a6ecd-e534-4e5f-af9d-93962acc5642-kube-api-access-chhht\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.078844 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-combined-ca-bundle\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.079561 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-internal-tls-certs\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.080311 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-public-tls-certs\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.087481 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-scripts\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.092261 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chhht\" (UniqueName: \"kubernetes.io/projected/fc8a6ecd-e534-4e5f-af9d-93962acc5642-kube-api-access-chhht\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.094380 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-config-data\") pod \"aodh-0\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.255936 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.258659 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ba640a0-7ad9-4deb-b5fe-e017d05aa746" path="/var/lib/kubelet/pods/9ba640a0-7ad9-4deb-b5fe-e017d05aa746/volumes"
Dec 03 17:50:10 crc kubenswrapper[4841]: W1203 17:50:10.715286 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc8a6ecd_e534_4e5f_af9d_93962acc5642.slice/crio-75fd6422703ca21b13a50832dcf69d1285f6189cfa489e9444cb3482ea5ab219 WatchSource:0}: Error finding container 75fd6422703ca21b13a50832dcf69d1285f6189cfa489e9444cb3482ea5ab219: Status 404 returned error can't find the container with id 75fd6422703ca21b13a50832dcf69d1285f6189cfa489e9444cb3482ea5ab219
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.716626 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Dec 03 17:50:10 crc kubenswrapper[4841]: I1203 17:50:10.807428 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerStarted","Data":"75fd6422703ca21b13a50832dcf69d1285f6189cfa489e9444cb3482ea5ab219"}
Dec 03 17:50:11 crc kubenswrapper[4841]: I1203 17:50:11.819360 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerStarted","Data":"eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98"}
Dec 03 17:50:12 crc kubenswrapper[4841]: I1203 17:50:12.836504 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerStarted","Data":"60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae"}
Dec 03 17:50:13 crc kubenswrapper[4841]: I1203 17:50:13.855019 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerStarted","Data":"95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec"}
Dec 03 17:50:13 crc kubenswrapper[4841]: I1203 17:50:13.855455 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerStarted","Data":"36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287"}
Dec 03 17:50:13 crc kubenswrapper[4841]: I1203 17:50:13.904443 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.299530906 podStartE2EDuration="4.904422241s" podCreationTimestamp="2025-12-03 17:50:09 +0000 UTC" firstStartedPulling="2025-12-03 17:50:10.718718706 +0000 UTC m=+3005.106239443" lastFinishedPulling="2025-12-03 17:50:13.323610011 +0000 UTC m=+3007.711130778" observedRunningTime="2025-12-03 17:50:13.89549383 +0000 UTC m=+3008.283014557" watchObservedRunningTime="2025-12-03 17:50:13.904422241 +0000 UTC m=+3008.291942968"
Dec 03 17:50:21 crc kubenswrapper[4841]: I1203 17:50:21.239762 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:50:21 crc kubenswrapper[4841]: E1203 17:50:21.240935 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:50:35 crc kubenswrapper[4841]: I1203 17:50:35.238528 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:50:35 crc kubenswrapper[4841]: E1203 17:50:35.239342 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:50:47 crc kubenswrapper[4841]: I1203 17:50:47.239467 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:50:47 crc kubenswrapper[4841]: E1203 17:50:47.240541 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:51:01 crc kubenswrapper[4841]: I1203 17:51:01.238663 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:51:01 crc kubenswrapper[4841]: E1203 17:51:01.239778 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:51:13 crc kubenswrapper[4841]: I1203 17:51:13.239648 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:51:13 crc kubenswrapper[4841]: E1203 17:51:13.240742 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:51:25 crc kubenswrapper[4841]: I1203 17:51:25.239204 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:51:25 crc kubenswrapper[4841]: E1203 17:51:25.240138 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:51:38 crc kubenswrapper[4841]: I1203 17:51:38.238538 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:51:38 crc kubenswrapper[4841]: E1203 17:51:38.239666 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:51:53 crc kubenswrapper[4841]: I1203 17:51:53.239503 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:51:53 crc kubenswrapper[4841]: E1203 17:51:53.242410 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:52:04 crc kubenswrapper[4841]: I1203 17:52:04.239243 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:52:04 crc kubenswrapper[4841]: E1203 17:52:04.240113 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:52:15 crc kubenswrapper[4841]: I1203 17:52:15.240510 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:52:15 crc kubenswrapper[4841]: E1203 17:52:15.241623 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:52:27 crc kubenswrapper[4841]: I1203 17:52:27.240537 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:52:27 crc kubenswrapper[4841]: E1203 17:52:27.241656 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:52:40 crc kubenswrapper[4841]: I1203 17:52:40.239529 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:52:40 crc kubenswrapper[4841]: E1203 17:52:40.240872 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:52:52 crc kubenswrapper[4841]: I1203 17:52:52.239851 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:52:52 crc kubenswrapper[4841]: E1203 17:52:52.240942 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:53:03 crc kubenswrapper[4841]: I1203 17:53:03.238734 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:53:03 crc kubenswrapper[4841]: E1203 17:53:03.239624 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:53:15 crc kubenswrapper[4841]: I1203 17:53:15.903733 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/manager/0.log"
Dec 03 17:53:18 crc kubenswrapper[4841]: I1203 17:53:18.238640 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:53:18 crc kubenswrapper[4841]: E1203 17:53:18.239385 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.309203 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"]
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.312973 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.318806 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.327554 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"]
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.402742 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.402817 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.403012 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddgff\" (UniqueName: \"kubernetes.io/projected/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-kube-api-access-ddgff\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.505886 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.506038 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.506129 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddgff\" (UniqueName: \"kubernetes.io/projected/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-kube-api-access-ddgff\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.508643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.508786 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.537355 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddgff\" (UniqueName: \"kubernetes.io/projected/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-kube-api-access-ddgff\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:29 crc kubenswrapper[4841]: I1203 17:53:29.653516 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:30 crc kubenswrapper[4841]: I1203 17:53:30.239932 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12"
Dec 03 17:53:30 crc kubenswrapper[4841]: E1203 17:53:30.240558 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3"
Dec 03 17:53:30 crc kubenswrapper[4841]: I1203 17:53:30.268167 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"]
Dec 03 17:53:30 crc kubenswrapper[4841]: W1203 17:53:30.269264 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d2b7f0f_9da1_4722_9f7d_851a5a5c5149.slice/crio-43b5fd798f1c0873fd43089d9fdd42adc7efd2e5cc389a40d6fed68352862a19 WatchSource:0}: Error finding container 43b5fd798f1c0873fd43089d9fdd42adc7efd2e5cc389a40d6fed68352862a19: Status 404 returned error can't find the container with id 43b5fd798f1c0873fd43089d9fdd42adc7efd2e5cc389a40d6fed68352862a19
Dec 03 17:53:31 crc kubenswrapper[4841]: I1203 17:53:31.148585 4841 generic.go:334] "Generic (PLEG): container finished" podID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerID="856e057d9b220cc020466407da1f4320b1b5130ff586ecfcefe9ccf2fcfa5d4f" exitCode=0
Dec 03 17:53:31 crc kubenswrapper[4841]: I1203 17:53:31.148716 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb" event={"ID":"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149","Type":"ContainerDied","Data":"856e057d9b220cc020466407da1f4320b1b5130ff586ecfcefe9ccf2fcfa5d4f"}
Dec 03 17:53:31 crc kubenswrapper[4841]: I1203 17:53:31.149402 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb" event={"ID":"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149","Type":"ContainerStarted","Data":"43b5fd798f1c0873fd43089d9fdd42adc7efd2e5cc389a40d6fed68352862a19"}
Dec 03 17:53:33 crc kubenswrapper[4841]: I1203 17:53:33.172336 4841 generic.go:334] "Generic (PLEG): container finished" podID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerID="451e4becd32e3fe24427c9be313ff4ecc8f990fcd8c2e7c1d6c9b70fd69899c1" exitCode=0
Dec 03 17:53:33 crc kubenswrapper[4841]: I1203 17:53:33.172453 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb" event={"ID":"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149","Type":"ContainerDied","Data":"451e4becd32e3fe24427c9be313ff4ecc8f990fcd8c2e7c1d6c9b70fd69899c1"}
Dec 03 17:53:34 crc kubenswrapper[4841]: I1203 17:53:34.189889 4841 generic.go:334] "Generic (PLEG): container finished" podID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerID="1cd6728767b0aae76bcad4cc22aae1aef8750462999af04474d460a340d0a52b" exitCode=0
Dec 03 17:53:34 crc kubenswrapper[4841]: I1203 17:53:34.190012 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb" event={"ID":"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149","Type":"ContainerDied","Data":"1cd6728767b0aae76bcad4cc22aae1aef8750462999af04474d460a340d0a52b"}
Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.694263 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb"
Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.838522 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-util\") pod \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") "
Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.838592 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-bundle\") pod \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") "
Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.838714 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddgff\" (UniqueName: \"kubernetes.io/projected/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-kube-api-access-ddgff\") pod \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\" (UID: \"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149\") "
Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.840410 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-bundle" (OuterVolumeSpecName: "bundle") pod "9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" (UID: "9d2b7f0f-9da1-4722-9f7d-851a5a5c5149"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.860858 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-kube-api-access-ddgff" (OuterVolumeSpecName: "kube-api-access-ddgff") pod "9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" (UID: "9d2b7f0f-9da1-4722-9f7d-851a5a5c5149"). InnerVolumeSpecName "kube-api-access-ddgff". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.914260 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-util" (OuterVolumeSpecName: "util") pod "9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" (UID: "9d2b7f0f-9da1-4722-9f7d-851a5a5c5149"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.940651 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddgff\" (UniqueName: \"kubernetes.io/projected/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-kube-api-access-ddgff\") on node \"crc\" DevicePath \"\"" Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.940683 4841 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-util\") on node \"crc\" DevicePath \"\"" Dec 03 17:53:35 crc kubenswrapper[4841]: I1203 17:53:35.940695 4841 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9d2b7f0f-9da1-4722-9f7d-851a5a5c5149-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:53:36 crc kubenswrapper[4841]: I1203 17:53:36.218898 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb" event={"ID":"9d2b7f0f-9da1-4722-9f7d-851a5a5c5149","Type":"ContainerDied","Data":"43b5fd798f1c0873fd43089d9fdd42adc7efd2e5cc389a40d6fed68352862a19"} Dec 03 17:53:36 crc kubenswrapper[4841]: I1203 17:53:36.218998 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb" Dec 03 17:53:36 crc kubenswrapper[4841]: I1203 17:53:36.219008 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43b5fd798f1c0873fd43089d9fdd42adc7efd2e5cc389a40d6fed68352862a19" Dec 03 17:53:42 crc kubenswrapper[4841]: I1203 17:53:42.238943 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" Dec 03 17:53:42 crc kubenswrapper[4841]: E1203 17:53:42.239592 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.012682 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq"] Dec 03 17:53:47 crc kubenswrapper[4841]: E1203 17:53:47.021765 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerName="pull" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.021791 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerName="pull" Dec 03 17:53:47 crc kubenswrapper[4841]: E1203 17:53:47.021833 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerName="extract" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.021843 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerName="extract" Dec 03 17:53:47 crc kubenswrapper[4841]: E1203 
17:53:47.021857 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerName="util" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.021865 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerName="util" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.022129 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2b7f0f-9da1-4722-9f7d-851a5a5c5149" containerName="extract" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.025244 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.027655 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.028622 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.028798 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-t7qct" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.029062 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.029207 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.036028 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.036440 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-qlktf" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.036777 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.049977 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.082771 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.084083 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.130640 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.177041 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f6178c0-01f4-437f-b7bd-bcae5afcec18-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4\" (UID: \"3f6178c0-01f4-437f-b7bd-bcae5afcec18\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.177100 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f6178c0-01f4-437f-b7bd-bcae5afcec18-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4\" (UID: \"3f6178c0-01f4-437f-b7bd-bcae5afcec18\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.177210 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jj49\" (UniqueName: \"kubernetes.io/projected/ab0ef110-9ded-4408-9f52-0f8bbffd4f25-kube-api-access-6jj49\") pod \"obo-prometheus-operator-668cf9dfbb-f7jbq\" (UID: \"ab0ef110-9ded-4408-9f52-0f8bbffd4f25\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.177255 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/853a7cd4-09bc-40c0-8b4c-3c91fb152dbe-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v\" (UID: \"853a7cd4-09bc-40c0-8b4c-3c91fb152dbe\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.177292 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/853a7cd4-09bc-40c0-8b4c-3c91fb152dbe-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v\" (UID: \"853a7cd4-09bc-40c0-8b4c-3c91fb152dbe\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.213955 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rkbb8"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.215108 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.234810 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-vkknj" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.235557 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.244423 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rkbb8"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.283315 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/123e62f6-3c8c-45f1-993c-12b1be324d9d-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rkbb8\" (UID: 
\"123e62f6-3c8c-45f1-993c-12b1be324d9d\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.283426 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jj49\" (UniqueName: \"kubernetes.io/projected/ab0ef110-9ded-4408-9f52-0f8bbffd4f25-kube-api-access-6jj49\") pod \"obo-prometheus-operator-668cf9dfbb-f7jbq\" (UID: \"ab0ef110-9ded-4408-9f52-0f8bbffd4f25\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.283476 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgkv7\" (UniqueName: \"kubernetes.io/projected/123e62f6-3c8c-45f1-993c-12b1be324d9d-kube-api-access-bgkv7\") pod \"observability-operator-d8bb48f5d-rkbb8\" (UID: \"123e62f6-3c8c-45f1-993c-12b1be324d9d\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.283502 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/853a7cd4-09bc-40c0-8b4c-3c91fb152dbe-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v\" (UID: \"853a7cd4-09bc-40c0-8b4c-3c91fb152dbe\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.283534 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/853a7cd4-09bc-40c0-8b4c-3c91fb152dbe-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v\" (UID: \"853a7cd4-09bc-40c0-8b4c-3c91fb152dbe\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.283593 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f6178c0-01f4-437f-b7bd-bcae5afcec18-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4\" (UID: \"3f6178c0-01f4-437f-b7bd-bcae5afcec18\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.283622 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f6178c0-01f4-437f-b7bd-bcae5afcec18-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4\" (UID: \"3f6178c0-01f4-437f-b7bd-bcae5afcec18\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.293839 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/853a7cd4-09bc-40c0-8b4c-3c91fb152dbe-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v\" (UID: \"853a7cd4-09bc-40c0-8b4c-3c91fb152dbe\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.294215 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3f6178c0-01f4-437f-b7bd-bcae5afcec18-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4\" (UID: \"3f6178c0-01f4-437f-b7bd-bcae5afcec18\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.298355 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3f6178c0-01f4-437f-b7bd-bcae5afcec18-apiservice-cert\") pod 
\"obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4\" (UID: \"3f6178c0-01f4-437f-b7bd-bcae5afcec18\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.298390 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/853a7cd4-09bc-40c0-8b4c-3c91fb152dbe-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v\" (UID: \"853a7cd4-09bc-40c0-8b4c-3c91fb152dbe\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.349591 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jj49\" (UniqueName: \"kubernetes.io/projected/ab0ef110-9ded-4408-9f52-0f8bbffd4f25-kube-api-access-6jj49\") pod \"obo-prometheus-operator-668cf9dfbb-f7jbq\" (UID: \"ab0ef110-9ded-4408-9f52-0f8bbffd4f25\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.356514 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.386139 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/123e62f6-3c8c-45f1-993c-12b1be324d9d-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rkbb8\" (UID: \"123e62f6-3c8c-45f1-993c-12b1be324d9d\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.386245 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgkv7\" (UniqueName: \"kubernetes.io/projected/123e62f6-3c8c-45f1-993c-12b1be324d9d-kube-api-access-bgkv7\") pod \"observability-operator-d8bb48f5d-rkbb8\" (UID: \"123e62f6-3c8c-45f1-993c-12b1be324d9d\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.393739 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/123e62f6-3c8c-45f1-993c-12b1be324d9d-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-rkbb8\" (UID: \"123e62f6-3c8c-45f1-993c-12b1be324d9d\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.411925 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.419145 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgkv7\" (UniqueName: \"kubernetes.io/projected/123e62f6-3c8c-45f1-993c-12b1be324d9d-kube-api-access-bgkv7\") pod \"observability-operator-d8bb48f5d-rkbb8\" (UID: \"123e62f6-3c8c-45f1-993c-12b1be324d9d\") " pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.421848 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5446b9c989-v2t4d"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.424255 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.431451 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-d6ph7" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.441288 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-v2t4d"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.487584 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f45374d5-3bf6-468b-9d32-be79178468a8-openshift-service-ca\") pod \"perses-operator-5446b9c989-v2t4d\" (UID: \"f45374d5-3bf6-468b-9d32-be79178468a8\") " pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.487624 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gg4w\" (UniqueName: \"kubernetes.io/projected/f45374d5-3bf6-468b-9d32-be79178468a8-kube-api-access-4gg4w\") 
pod \"perses-operator-5446b9c989-v2t4d\" (UID: \"f45374d5-3bf6-468b-9d32-be79178468a8\") " pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.531520 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.589748 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f45374d5-3bf6-468b-9d32-be79178468a8-openshift-service-ca\") pod \"perses-operator-5446b9c989-v2t4d\" (UID: \"f45374d5-3bf6-468b-9d32-be79178468a8\") " pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.589787 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gg4w\" (UniqueName: \"kubernetes.io/projected/f45374d5-3bf6-468b-9d32-be79178468a8-kube-api-access-4gg4w\") pod \"perses-operator-5446b9c989-v2t4d\" (UID: \"f45374d5-3bf6-468b-9d32-be79178468a8\") " pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.590964 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/f45374d5-3bf6-468b-9d32-be79178468a8-openshift-service-ca\") pod \"perses-operator-5446b9c989-v2t4d\" (UID: \"f45374d5-3bf6-468b-9d32-be79178468a8\") " pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.610450 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gg4w\" (UniqueName: \"kubernetes.io/projected/f45374d5-3bf6-468b-9d32-be79178468a8-kube-api-access-4gg4w\") pod \"perses-operator-5446b9c989-v2t4d\" (UID: \"f45374d5-3bf6-468b-9d32-be79178468a8\") " 
pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.645695 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.740740 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.846666 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.977891 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v"] Dec 03 17:53:47 crc kubenswrapper[4841]: I1203 17:53:47.987018 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq"] Dec 03 17:53:47 crc kubenswrapper[4841]: W1203 17:53:47.995719 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod853a7cd4_09bc_40c0_8b4c_3c91fb152dbe.slice/crio-790a4fdf5fd6efdb52270e4457c00b706478595001ace1c75a15e3fb19f25d3c WatchSource:0}: Error finding container 790a4fdf5fd6efdb52270e4457c00b706478595001ace1c75a15e3fb19f25d3c: Status 404 returned error can't find the container with id 790a4fdf5fd6efdb52270e4457c00b706478595001ace1c75a15e3fb19f25d3c Dec 03 17:53:48 crc kubenswrapper[4841]: W1203 17:53:48.013095 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab0ef110_9ded_4408_9f52_0f8bbffd4f25.slice/crio-ad48ae5fef0a43c945a2bb24095ccb5c1adc5111e3cc5ddf6ce05d47f56f68f7 WatchSource:0}: Error finding container 
ad48ae5fef0a43c945a2bb24095ccb5c1adc5111e3cc5ddf6ce05d47f56f68f7: Status 404 returned error can't find the container with id ad48ae5fef0a43c945a2bb24095ccb5c1adc5111e3cc5ddf6ce05d47f56f68f7 Dec 03 17:53:48 crc kubenswrapper[4841]: I1203 17:53:48.093456 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-rkbb8"] Dec 03 17:53:48 crc kubenswrapper[4841]: I1203 17:53:48.344415 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" event={"ID":"123e62f6-3c8c-45f1-993c-12b1be324d9d","Type":"ContainerStarted","Data":"8b2f33973b940b6a0b6fbfa7d13521bbb7503d2a06d97ebf72e3a79ceb2b40b9"} Dec 03 17:53:48 crc kubenswrapper[4841]: I1203 17:53:48.345343 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" event={"ID":"3f6178c0-01f4-437f-b7bd-bcae5afcec18","Type":"ContainerStarted","Data":"f7c7f25bcda3ceab6e058eec7d05b8e9d1a32b1f53519b68ab3a8ef741f5bded"} Dec 03 17:53:48 crc kubenswrapper[4841]: I1203 17:53:48.346287 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" event={"ID":"853a7cd4-09bc-40c0-8b4c-3c91fb152dbe","Type":"ContainerStarted","Data":"790a4fdf5fd6efdb52270e4457c00b706478595001ace1c75a15e3fb19f25d3c"} Dec 03 17:53:48 crc kubenswrapper[4841]: I1203 17:53:48.347083 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" event={"ID":"ab0ef110-9ded-4408-9f52-0f8bbffd4f25","Type":"ContainerStarted","Data":"ad48ae5fef0a43c945a2bb24095ccb5c1adc5111e3cc5ddf6ce05d47f56f68f7"} Dec 03 17:53:48 crc kubenswrapper[4841]: I1203 17:53:48.374700 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-v2t4d"] Dec 03 17:53:49 crc kubenswrapper[4841]: I1203 17:53:49.361446 4841 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-v2t4d" event={"ID":"f45374d5-3bf6-468b-9d32-be79178468a8","Type":"ContainerStarted","Data":"f5f1a511f55c9c167c3caa13864d24de3a9a883196d71528c7fc78ebd6e98165"} Dec 03 17:53:57 crc kubenswrapper[4841]: I1203 17:53:57.239361 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" Dec 03 17:53:57 crc kubenswrapper[4841]: E1203 17:53:57.241359 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:54:04 crc kubenswrapper[4841]: E1203 17:54:04.232468 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385" Dec 03 17:54:04 crc kubenswrapper[4841]: E1203 17:54:04.233031 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:perses-operator,Image:registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{134217728 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:openshift-service-ca,ReadOnly:true,MountPath:/ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4gg4w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000350000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod perses-operator-5446b9c989-v2t4d_openshift-operators(f45374d5-3bf6-468b-9d32-be79178468a8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 17:54:04 crc kubenswrapper[4841]: E1203 17:54:04.234211 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"perses-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/perses-operator-5446b9c989-v2t4d" podUID="f45374d5-3bf6-468b-9d32-be79178468a8" Dec 03 17:54:04 crc kubenswrapper[4841]: E1203 17:54:04.547084 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"perses-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/perses-rhel9-operator@sha256:9aec4c328ec43e40481e06ca5808deead74b75c0aacb90e9e72966c3fa14f385\\\"\"" pod="openshift-operators/perses-operator-5446b9c989-v2t4d" podUID="f45374d5-3bf6-468b-9d32-be79178468a8" Dec 03 17:54:05 crc kubenswrapper[4841]: E1203 17:54:05.097192 4841 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3" Dec 03 17:54:05 crc kubenswrapper[4841]: E1203 17:54:05.097310 4841 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator 
--thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:1133c973c7472c665f910a722e19c8e2e27accb34b90fab67f14548627ce9c62,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6jj49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-668cf9dfbb-f7jbq_openshift-operators(ab0ef110-9ded-4408-9f52-0f8bbffd4f25): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 17:54:05 crc 
kubenswrapper[4841]: E1203 17:54:05.098558 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" podUID="ab0ef110-9ded-4408-9f52-0f8bbffd4f25" Dec 03 17:54:05 crc kubenswrapper[4841]: I1203 17:54:05.559352 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" event={"ID":"123e62f6-3c8c-45f1-993c-12b1be324d9d","Type":"ContainerStarted","Data":"c07fd426ae160a09eb6d4cf0ba1eaf5c218aaf1ab3effafd2802607354321bf9"} Dec 03 17:54:05 crc kubenswrapper[4841]: I1203 17:54:05.559760 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:54:05 crc kubenswrapper[4841]: I1203 17:54:05.562653 4841 patch_prober.go:28] interesting pod/observability-operator-d8bb48f5d-rkbb8 container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.255:8081/healthz\": dial tcp 10.217.0.255:8081: connect: connection refused" start-of-body= Dec 03 17:54:05 crc kubenswrapper[4841]: I1203 17:54:05.562710 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" podUID="123e62f6-3c8c-45f1-993c-12b1be324d9d" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.255:8081/healthz\": dial tcp 10.217.0.255:8081: connect: connection refused" Dec 03 17:54:05 crc kubenswrapper[4841]: I1203 17:54:05.562885 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" 
event={"ID":"3f6178c0-01f4-437f-b7bd-bcae5afcec18","Type":"ContainerStarted","Data":"16bd4624a78b60b0dd183e41fa565917d7bd50a34095ebdce3debc6d2378d21c"} Dec 03 17:54:05 crc kubenswrapper[4841]: E1203 17:54:05.567056 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:203cf5b9dc1460f09e75f58d8b5cf7df5e57c18c8c6a41c14b5e8977d83263f3\\\"\"" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" podUID="ab0ef110-9ded-4408-9f52-0f8bbffd4f25" Dec 03 17:54:05 crc kubenswrapper[4841]: I1203 17:54:05.592708 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" podStartSLOduration=1.5756231 podStartE2EDuration="18.592683193s" podCreationTimestamp="2025-12-03 17:53:47 +0000 UTC" firstStartedPulling="2025-12-03 17:53:48.106399571 +0000 UTC m=+3222.493920298" lastFinishedPulling="2025-12-03 17:54:05.123459664 +0000 UTC m=+3239.510980391" observedRunningTime="2025-12-03 17:54:05.588641312 +0000 UTC m=+3239.976162050" watchObservedRunningTime="2025-12-03 17:54:05.592683193 +0000 UTC m=+3239.980203930" Dec 03 17:54:05 crc kubenswrapper[4841]: I1203 17:54:05.626970 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" podStartSLOduration=1.541431373 podStartE2EDuration="18.62694882s" podCreationTimestamp="2025-12-03 17:53:47 +0000 UTC" firstStartedPulling="2025-12-03 17:53:48.010685743 +0000 UTC m=+3222.398206470" lastFinishedPulling="2025-12-03 17:54:05.09620301 +0000 UTC m=+3239.483723917" observedRunningTime="2025-12-03 17:54:05.623073634 +0000 UTC m=+3240.010594361" watchObservedRunningTime="2025-12-03 17:54:05.62694882 +0000 UTC m=+3240.014469547" Dec 03 17:54:05 crc 
kubenswrapper[4841]: I1203 17:54:05.645828 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4" podStartSLOduration=2.265766475 podStartE2EDuration="19.645812787s" podCreationTimestamp="2025-12-03 17:53:46 +0000 UTC" firstStartedPulling="2025-12-03 17:53:47.736821919 +0000 UTC m=+3222.124342646" lastFinishedPulling="2025-12-03 17:54:05.116868231 +0000 UTC m=+3239.504388958" observedRunningTime="2025-12-03 17:54:05.645769326 +0000 UTC m=+3240.033290063" watchObservedRunningTime="2025-12-03 17:54:05.645812787 +0000 UTC m=+3240.033333514" Dec 03 17:54:06 crc kubenswrapper[4841]: I1203 17:54:06.576737 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v" event={"ID":"853a7cd4-09bc-40c0-8b4c-3c91fb152dbe","Type":"ContainerStarted","Data":"e27e851ce096b3e9390524427115087d1a3151804a9b893ce4e686e7fd166c20"} Dec 03 17:54:06 crc kubenswrapper[4841]: I1203 17:54:06.582185 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-rkbb8" Dec 03 17:54:09 crc kubenswrapper[4841]: I1203 17:54:09.239381 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" Dec 03 17:54:09 crc kubenswrapper[4841]: E1203 17:54:09.240452 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:54:14 crc kubenswrapper[4841]: I1203 17:54:14.625118 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/aodh-0"] Dec 03 17:54:14 crc kubenswrapper[4841]: I1203 17:54:14.625748 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-api" containerID="cri-o://eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98" gracePeriod=30 Dec 03 17:54:14 crc kubenswrapper[4841]: I1203 17:54:14.626106 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-notifier" containerID="cri-o://36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287" gracePeriod=30 Dec 03 17:54:14 crc kubenswrapper[4841]: I1203 17:54:14.626145 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-evaluator" containerID="cri-o://60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae" gracePeriod=30 Dec 03 17:54:14 crc kubenswrapper[4841]: I1203 17:54:14.626241 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-listener" containerID="cri-o://95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec" gracePeriod=30 Dec 03 17:54:15 crc kubenswrapper[4841]: I1203 17:54:15.668409 4841 generic.go:334] "Generic (PLEG): container finished" podID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerID="60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae" exitCode=0 Dec 03 17:54:15 crc kubenswrapper[4841]: I1203 17:54:15.668757 4841 generic.go:334] "Generic (PLEG): container finished" podID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerID="eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98" exitCode=0 Dec 03 17:54:15 crc kubenswrapper[4841]: I1203 17:54:15.668584 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerDied","Data":"60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae"} Dec 03 17:54:15 crc kubenswrapper[4841]: I1203 17:54:15.668811 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerDied","Data":"eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98"} Dec 03 17:54:16 crc kubenswrapper[4841]: I1203 17:54:16.246700 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:54:17 crc kubenswrapper[4841]: I1203 17:54:17.717241 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-v2t4d" event={"ID":"f45374d5-3bf6-468b-9d32-be79178468a8","Type":"ContainerStarted","Data":"034f2a8e12360e249d0d1e55e0830717e07beca390c9bb05526ecb49709f3073"} Dec 03 17:54:17 crc kubenswrapper[4841]: I1203 17:54:17.717819 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:54:20 crc kubenswrapper[4841]: I1203 17:54:20.292943 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5446b9c989-v2t4d" podStartSLOduration=4.5079756060000005 podStartE2EDuration="33.292925536s" podCreationTimestamp="2025-12-03 17:53:47 +0000 UTC" firstStartedPulling="2025-12-03 17:53:48.401627955 +0000 UTC m=+3222.789148692" lastFinishedPulling="2025-12-03 17:54:17.186577855 +0000 UTC m=+3251.574098622" observedRunningTime="2025-12-03 17:54:17.739348951 +0000 UTC m=+3252.126869718" watchObservedRunningTime="2025-12-03 17:54:20.292925536 +0000 UTC m=+3254.680446263" Dec 03 17:54:20 crc kubenswrapper[4841]: I1203 17:54:20.762275 4841 generic.go:334] "Generic (PLEG): container finished" podID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" 
containerID="36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287" exitCode=0 Dec 03 17:54:20 crc kubenswrapper[4841]: I1203 17:54:20.762321 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerDied","Data":"36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287"} Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.453971 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.584447 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-combined-ca-bundle\") pod \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.584514 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chhht\" (UniqueName: \"kubernetes.io/projected/fc8a6ecd-e534-4e5f-af9d-93962acc5642-kube-api-access-chhht\") pod \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.584616 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-public-tls-certs\") pod \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.584694 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-config-data\") pod \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " Dec 03 17:54:21 crc 
kubenswrapper[4841]: I1203 17:54:21.584743 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-scripts\") pod \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.584781 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-internal-tls-certs\") pod \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\" (UID: \"fc8a6ecd-e534-4e5f-af9d-93962acc5642\") " Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.591871 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc8a6ecd-e534-4e5f-af9d-93962acc5642-kube-api-access-chhht" (OuterVolumeSpecName: "kube-api-access-chhht") pod "fc8a6ecd-e534-4e5f-af9d-93962acc5642" (UID: "fc8a6ecd-e534-4e5f-af9d-93962acc5642"). InnerVolumeSpecName "kube-api-access-chhht". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.592182 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-scripts" (OuterVolumeSpecName: "scripts") pod "fc8a6ecd-e534-4e5f-af9d-93962acc5642" (UID: "fc8a6ecd-e534-4e5f-af9d-93962acc5642"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.661693 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fc8a6ecd-e534-4e5f-af9d-93962acc5642" (UID: "fc8a6ecd-e534-4e5f-af9d-93962acc5642"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.679062 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fc8a6ecd-e534-4e5f-af9d-93962acc5642" (UID: "fc8a6ecd-e534-4e5f-af9d-93962acc5642"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.686917 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.686948 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.686958 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.686966 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chhht\" (UniqueName: \"kubernetes.io/projected/fc8a6ecd-e534-4e5f-af9d-93962acc5642-kube-api-access-chhht\") on node \"crc\" DevicePath \"\"" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.697576 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc8a6ecd-e534-4e5f-af9d-93962acc5642" (UID: "fc8a6ecd-e534-4e5f-af9d-93962acc5642"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.721484 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-config-data" (OuterVolumeSpecName: "config-data") pod "fc8a6ecd-e534-4e5f-af9d-93962acc5642" (UID: "fc8a6ecd-e534-4e5f-af9d-93962acc5642"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.772599 4841 generic.go:334] "Generic (PLEG): container finished" podID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerID="95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec" exitCode=0 Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.772696 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerDied","Data":"95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec"} Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.772842 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"fc8a6ecd-e534-4e5f-af9d-93962acc5642","Type":"ContainerDied","Data":"75fd6422703ca21b13a50832dcf69d1285f6189cfa489e9444cb3482ea5ab219"} Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.772868 4841 scope.go:117] "RemoveContainer" containerID="95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.773131 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.774599 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" event={"ID":"ab0ef110-9ded-4408-9f52-0f8bbffd4f25","Type":"ContainerStarted","Data":"332da006492696e9e4fba82e4b6620372b338bd6404c88d89bb0d3c0c71e541b"} Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.788303 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.788336 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc8a6ecd-e534-4e5f-af9d-93962acc5642-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.796898 4841 scope.go:117] "RemoveContainer" containerID="36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.795473 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-f7jbq" podStartSLOduration=2.800119495 podStartE2EDuration="35.795451709s" podCreationTimestamp="2025-12-03 17:53:46 +0000 UTC" firstStartedPulling="2025-12-03 17:53:48.019742288 +0000 UTC m=+3222.407263015" lastFinishedPulling="2025-12-03 17:54:21.015074502 +0000 UTC m=+3255.402595229" observedRunningTime="2025-12-03 17:54:21.790162338 +0000 UTC m=+3256.177683085" watchObservedRunningTime="2025-12-03 17:54:21.795451709 +0000 UTC m=+3256.182972436" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.839779 4841 scope.go:117] "RemoveContainer" containerID="60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.863097 4841 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.875463 4841 scope.go:117] "RemoveContainer" containerID="eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.883482 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.894821 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 17:54:21 crc kubenswrapper[4841]: E1203 17:54:21.895500 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-notifier" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.895517 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-notifier" Dec 03 17:54:21 crc kubenswrapper[4841]: E1203 17:54:21.895531 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-evaluator" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.895537 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-evaluator" Dec 03 17:54:21 crc kubenswrapper[4841]: E1203 17:54:21.895564 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-api" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.895570 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-api" Dec 03 17:54:21 crc kubenswrapper[4841]: E1203 17:54:21.895601 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-listener" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.895607 4841 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-listener" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.895948 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-evaluator" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.895975 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-listener" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.895989 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-api" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.896008 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" containerName="aodh-notifier" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.899417 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.905750 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.906046 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.906237 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.906504 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9f96b" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.907063 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.914603 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.928727 4841 scope.go:117] "RemoveContainer" containerID="95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec" Dec 03 17:54:21 crc kubenswrapper[4841]: E1203 17:54:21.929570 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec\": container with ID starting with 95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec not found: ID does not exist" containerID="95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.929609 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec"} err="failed to get container status 
\"95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec\": rpc error: code = NotFound desc = could not find container \"95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec\": container with ID starting with 95183a35feb597c1874caf22338bc58db0d3def45390a058ff7d7f4a50a16dec not found: ID does not exist" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.929634 4841 scope.go:117] "RemoveContainer" containerID="36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287" Dec 03 17:54:21 crc kubenswrapper[4841]: E1203 17:54:21.930027 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287\": container with ID starting with 36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287 not found: ID does not exist" containerID="36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.930083 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287"} err="failed to get container status \"36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287\": rpc error: code = NotFound desc = could not find container \"36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287\": container with ID starting with 36a4849993e808e583503c6699e710b1885b38063ce21a668ede5946269f2287 not found: ID does not exist" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.930110 4841 scope.go:117] "RemoveContainer" containerID="60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae" Dec 03 17:54:21 crc kubenswrapper[4841]: E1203 17:54:21.930356 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae\": container with ID starting with 60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae not found: ID does not exist" containerID="60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.930441 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae"} err="failed to get container status \"60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae\": rpc error: code = NotFound desc = could not find container \"60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae\": container with ID starting with 60fee266edb967a7ceb5c73dee1450cb7bf08cf52c366f5ca087ba863b4a8dae not found: ID does not exist" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.930524 4841 scope.go:117] "RemoveContainer" containerID="eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98" Dec 03 17:54:21 crc kubenswrapper[4841]: E1203 17:54:21.930758 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98\": container with ID starting with eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98 not found: ID does not exist" containerID="eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.930780 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98"} err="failed to get container status \"eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98\": rpc error: code = NotFound desc = could not find container \"eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98\": container with ID 
starting with eb5dad6fcb0ec9c79887143daf4cbc853825f058c4990dc768eee8e9eed08d98 not found: ID does not exist" Dec 03 17:54:21 crc kubenswrapper[4841]: E1203 17:54:21.978557 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc8a6ecd_e534_4e5f_af9d_93962acc5642.slice\": RecentStats: unable to find data in memory cache]" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.992793 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-scripts\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.992856 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-config-data\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.992913 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.992936 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-public-tls-certs\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.992976 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqkb7\" (UniqueName: \"kubernetes.io/projected/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-kube-api-access-tqkb7\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:21 crc kubenswrapper[4841]: I1203 17:54:21.993029 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-internal-tls-certs\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.094225 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-internal-tls-certs\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.094314 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-scripts\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.094347 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-config-data\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.094373 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-combined-ca-bundle\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " 
pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.094393 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-public-tls-certs\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.094435 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkb7\" (UniqueName: \"kubernetes.io/projected/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-kube-api-access-tqkb7\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.098746 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-public-tls-certs\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.098790 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-internal-tls-certs\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.098952 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-scripts\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.099264 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-combined-ca-bundle\") pod \"aodh-0\" 
(UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.099797 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-config-data\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.110309 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkb7\" (UniqueName: \"kubernetes.io/projected/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-kube-api-access-tqkb7\") pod \"aodh-0\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.227130 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.254228 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc8a6ecd-e534-4e5f-af9d-93962acc5642" path="/var/lib/kubelet/pods/fc8a6ecd-e534-4e5f-af9d-93962acc5642/volumes" Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.709819 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 17:54:22 crc kubenswrapper[4841]: I1203 17:54:22.784172 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerStarted","Data":"5357bb547fc286182de9277791c7f92f61f97c64dc2fe67cef50f5322db8243d"} Dec 03 17:54:23 crc kubenswrapper[4841]: I1203 17:54:23.238774 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" Dec 03 17:54:23 crc kubenswrapper[4841]: E1203 17:54:23.239349 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:54:23 crc kubenswrapper[4841]: I1203 17:54:23.796780 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerStarted","Data":"2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0"} Dec 03 17:54:24 crc kubenswrapper[4841]: I1203 17:54:24.812688 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerStarted","Data":"286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3"} Dec 03 17:54:25 crc kubenswrapper[4841]: I1203 17:54:25.823302 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerStarted","Data":"fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8"} Dec 03 17:54:25 crc kubenswrapper[4841]: I1203 17:54:25.978266 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 17:54:25 crc kubenswrapper[4841]: I1203 17:54:25.980732 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:25 crc kubenswrapper[4841]: I1203 17:54:25.983840 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 03 17:54:25 crc kubenswrapper[4841]: I1203 17:54:25.983847 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 03 17:54:25 crc kubenswrapper[4841]: I1203 17:54:25.983914 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 03 17:54:25 crc kubenswrapper[4841]: I1203 17:54:25.984193 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-gbtpl" Dec 03 17:54:25 crc kubenswrapper[4841]: I1203 17:54:25.984831 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.009537 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.077318 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00f46b57-a05f-43fe-b97d-1f59137df281-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.077366 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00f46b57-a05f-43fe-b97d-1f59137df281-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 
crc kubenswrapper[4841]: I1203 17:54:26.077419 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00f46b57-a05f-43fe-b97d-1f59137df281-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.077439 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00f46b57-a05f-43fe-b97d-1f59137df281-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.077643 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/00f46b57-a05f-43fe-b97d-1f59137df281-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.077895 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00f46b57-a05f-43fe-b97d-1f59137df281-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.078140 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktrh7\" (UniqueName: \"kubernetes.io/projected/00f46b57-a05f-43fe-b97d-1f59137df281-kube-api-access-ktrh7\") pod \"alertmanager-metric-storage-0\" (UID: 
\"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.180004 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/00f46b57-a05f-43fe-b97d-1f59137df281-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.180117 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00f46b57-a05f-43fe-b97d-1f59137df281-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.180167 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktrh7\" (UniqueName: \"kubernetes.io/projected/00f46b57-a05f-43fe-b97d-1f59137df281-kube-api-access-ktrh7\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.180204 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00f46b57-a05f-43fe-b97d-1f59137df281-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.180221 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00f46b57-a05f-43fe-b97d-1f59137df281-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: 
\"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.180264 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00f46b57-a05f-43fe-b97d-1f59137df281-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.180285 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00f46b57-a05f-43fe-b97d-1f59137df281-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.180540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/00f46b57-a05f-43fe-b97d-1f59137df281-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.184121 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/00f46b57-a05f-43fe-b97d-1f59137df281-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.184212 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/00f46b57-a05f-43fe-b97d-1f59137df281-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " 
pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.184492 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/00f46b57-a05f-43fe-b97d-1f59137df281-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.185016 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/00f46b57-a05f-43fe-b97d-1f59137df281-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.187312 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/00f46b57-a05f-43fe-b97d-1f59137df281-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.201233 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktrh7\" (UniqueName: \"kubernetes.io/projected/00f46b57-a05f-43fe-b97d-1f59137df281-kube-api-access-ktrh7\") pod \"alertmanager-metric-storage-0\" (UID: \"00f46b57-a05f-43fe-b97d-1f59137df281\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.304767 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.829071 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.838102 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerStarted","Data":"990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a"} Dec 03 17:54:26 crc kubenswrapper[4841]: I1203 17:54:26.868178 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.978666822 podStartE2EDuration="5.868158328s" podCreationTimestamp="2025-12-03 17:54:21 +0000 UTC" firstStartedPulling="2025-12-03 17:54:22.721001778 +0000 UTC m=+3257.108522505" lastFinishedPulling="2025-12-03 17:54:25.610493284 +0000 UTC m=+3259.998014011" observedRunningTime="2025-12-03 17:54:26.865667137 +0000 UTC m=+3261.253187864" watchObservedRunningTime="2025-12-03 17:54:26.868158328 +0000 UTC m=+3261.255679075" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.037196 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.044424 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.046675 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.047181 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4mqjg" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.048104 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.048403 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.049036 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.049193 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.054660 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.103341 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nj8f\" (UniqueName: \"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-kube-api-access-5nj8f\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.103421 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.103483 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.103513 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.103555 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.103598 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.103642 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.103661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.205673 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.205952 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.206037 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.206137 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nj8f\" 
(UniqueName: \"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-kube-api-access-5nj8f\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.206245 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.206319 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.206406 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.206490 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.206581 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.206664 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.213256 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.213707 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.214542 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.214814 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.220630 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.228195 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nj8f\" (UniqueName: \"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-kube-api-access-5nj8f\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.259035 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.381940 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.849724 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"00f46b57-a05f-43fe-b97d-1f59137df281","Type":"ContainerStarted","Data":"a3863c8efd71b451ec3239b8fda4dabda1d8f29894f8d9fec2a8853bdd3cf102"} Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.855992 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-v2t4d" Dec 03 17:54:27 crc kubenswrapper[4841]: I1203 17:54:27.990612 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:54:28 crc kubenswrapper[4841]: I1203 17:54:28.863150 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerStarted","Data":"ee2c16c440c79bcbc951c7f2e4a6998b06f88db0a63ef000f249796b614440e8"} Dec 03 17:54:32 crc kubenswrapper[4841]: I1203 17:54:32.904407 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerStarted","Data":"531769ff38addc9f929b0cd933911ab967c6b56cbae5ecf324f79796f04e665f"} Dec 03 17:54:32 crc kubenswrapper[4841]: I1203 17:54:32.906452 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"00f46b57-a05f-43fe-b97d-1f59137df281","Type":"ContainerStarted","Data":"b2479df85a8deb48833e001e4c92dff1d0a1724a321e55230a7aad05f3170e7c"} Dec 03 17:54:37 crc kubenswrapper[4841]: I1203 17:54:37.238589 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" Dec 03 17:54:37 crc kubenswrapper[4841]: E1203 17:54:37.239504 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:54:38 crc kubenswrapper[4841]: I1203 17:54:38.976363 4841 generic.go:334] "Generic (PLEG): container finished" podID="00f46b57-a05f-43fe-b97d-1f59137df281" containerID="b2479df85a8deb48833e001e4c92dff1d0a1724a321e55230a7aad05f3170e7c" exitCode=0 Dec 03 17:54:38 crc kubenswrapper[4841]: I1203 17:54:38.976500 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"00f46b57-a05f-43fe-b97d-1f59137df281","Type":"ContainerDied","Data":"b2479df85a8deb48833e001e4c92dff1d0a1724a321e55230a7aad05f3170e7c"} Dec 03 17:54:39 crc kubenswrapper[4841]: I1203 17:54:39.990864 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerID="531769ff38addc9f929b0cd933911ab967c6b56cbae5ecf324f79796f04e665f" exitCode=0 Dec 03 17:54:39 crc kubenswrapper[4841]: I1203 17:54:39.990966 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerDied","Data":"531769ff38addc9f929b0cd933911ab967c6b56cbae5ecf324f79796f04e665f"} Dec 03 17:54:42 crc kubenswrapper[4841]: I1203 17:54:42.078985 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"00f46b57-a05f-43fe-b97d-1f59137df281","Type":"ContainerStarted","Data":"aea448eda5d799d820a79ae175aef5b9681ae4e51f07566c657389bcd04229b5"} Dec 03 17:54:48 crc kubenswrapper[4841]: I1203 17:54:48.157534 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" 
event={"ID":"00f46b57-a05f-43fe-b97d-1f59137df281","Type":"ContainerStarted","Data":"3f42f0faccff45f085eb9976935bc5a872f6706c5b72c2e96348ed8f05409002"} Dec 03 17:54:49 crc kubenswrapper[4841]: I1203 17:54:49.167660 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:49 crc kubenswrapper[4841]: I1203 17:54:49.171275 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 03 17:54:49 crc kubenswrapper[4841]: I1203 17:54:49.214116 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=9.477188345 podStartE2EDuration="24.214092597s" podCreationTimestamp="2025-12-03 17:54:25 +0000 UTC" firstStartedPulling="2025-12-03 17:54:26.834645369 +0000 UTC m=+3261.222166096" lastFinishedPulling="2025-12-03 17:54:41.571549621 +0000 UTC m=+3275.959070348" observedRunningTime="2025-12-03 17:54:49.196284287 +0000 UTC m=+3283.583805074" watchObservedRunningTime="2025-12-03 17:54:49.214092597 +0000 UTC m=+3283.601613334" Dec 03 17:54:50 crc kubenswrapper[4841]: I1203 17:54:50.179445 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerStarted","Data":"6972bf8ee4788e6ef5374972b90e6f8c7283ddeaea8b2c5258363474ffd48dcf"} Dec 03 17:54:52 crc kubenswrapper[4841]: I1203 17:54:52.238954 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" Dec 03 17:54:52 crc kubenswrapper[4841]: E1203 17:54:52.240365 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:54:54 crc kubenswrapper[4841]: I1203 17:54:54.230755 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerStarted","Data":"b9f174049211a83d89670abc9967d5c0324f27084e177d3ad4397b0a2d368e1c"} Dec 03 17:54:59 crc kubenswrapper[4841]: I1203 17:54:59.286314 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerStarted","Data":"13a4a99dae6715fc5d0d1fd290c234680802065de27880297f22da33abd21be8"} Dec 03 17:54:59 crc kubenswrapper[4841]: I1203 17:54:59.322593 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=2.599505591 podStartE2EDuration="33.322575022s" podCreationTimestamp="2025-12-03 17:54:26 +0000 UTC" firstStartedPulling="2025-12-03 17:54:27.996878392 +0000 UTC m=+3262.384399119" lastFinishedPulling="2025-12-03 17:54:58.719947773 +0000 UTC m=+3293.107468550" observedRunningTime="2025-12-03 17:54:59.314983204 +0000 UTC m=+3293.702503951" watchObservedRunningTime="2025-12-03 17:54:59.322575022 +0000 UTC m=+3293.710095749" Dec 03 17:55:02 crc kubenswrapper[4841]: I1203 17:55:02.383002 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:07 crc kubenswrapper[4841]: I1203 17:55:07.239740 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" Dec 03 17:55:07 crc kubenswrapper[4841]: E1203 17:55:07.240976 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 17:55:12 crc kubenswrapper[4841]: I1203 17:55:12.382268 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:12 crc kubenswrapper[4841]: I1203 17:55:12.385172 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:12 crc kubenswrapper[4841]: I1203 17:55:12.452288 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.215073 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.215697 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" containerName="openstackclient" containerID="cri-o://cf3231110f26c90878f3e6247ea3f5f1ea144fbe936ee3be5855e219d0546edb" gracePeriod=2 Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.232478 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.253043 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 17:55:14 crc kubenswrapper[4841]: E1203 17:55:14.253417 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" containerName="openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.253437 4841 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" containerName="openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.253708 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" containerName="openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.254515 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.255846 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.324050 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vndk\" (UniqueName: \"kubernetes.io/projected/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-kube-api-access-6vndk\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.324346 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-openstack-config-secret\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.324508 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-openstack-config\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.324538 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.427048 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vndk\" (UniqueName: \"kubernetes.io/projected/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-kube-api-access-6vndk\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.427103 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-openstack-config-secret\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.427187 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-openstack-config\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.427256 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.428202 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-openstack-config\") pod \"openstackclient\" (UID: 
\"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.432774 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-combined-ca-bundle\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.433306 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-openstack-config-secret\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.446157 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vndk\" (UniqueName: \"kubernetes.io/projected/3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7-kube-api-access-6vndk\") pod \"openstackclient\" (UID: \"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7\") " pod="openstack/openstackclient" Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.519618 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.519995 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-api" containerID="cri-o://2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0" gracePeriod=30 Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.520401 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-listener" containerID="cri-o://990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a" gracePeriod=30 Dec 03 
17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.520459 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-notifier" containerID="cri-o://fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8" gracePeriod=30 Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.520501 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-evaluator" containerID="cri-o://286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3" gracePeriod=30 Dec 03 17:55:14 crc kubenswrapper[4841]: I1203 17:55:14.590849 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.186208 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.481237 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7","Type":"ContainerStarted","Data":"a05090500728c4cd41cb675bbedd598d44871029a858128fed73d3b76c69e9af"} Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.481529 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7","Type":"ContainerStarted","Data":"b9d4f3a37dad797387c79fa6931718fb652d08ec3536f225d69bd1c151a7fdf4"} Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.486513 4841 generic.go:334] "Generic (PLEG): container finished" podID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerID="286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3" exitCode=0 Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.486548 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerID="2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0" exitCode=0 Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.486575 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerDied","Data":"286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3"} Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.486603 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerDied","Data":"2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0"} Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.507856 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.507839255 podStartE2EDuration="1.507839255s" podCreationTimestamp="2025-12-03 17:55:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:55:15.504710927 +0000 UTC m=+3309.892231654" watchObservedRunningTime="2025-12-03 17:55:15.507839255 +0000 UTC m=+3309.895359982" Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.583852 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.584185 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="prometheus" containerID="cri-o://6972bf8ee4788e6ef5374972b90e6f8c7283ddeaea8b2c5258363474ffd48dcf" gracePeriod=600 Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.584311 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" 
podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="thanos-sidecar" containerID="cri-o://13a4a99dae6715fc5d0d1fd290c234680802065de27880297f22da33abd21be8" gracePeriod=600 Dec 03 17:55:15 crc kubenswrapper[4841]: I1203 17:55:15.584354 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="config-reloader" containerID="cri-o://b9f174049211a83d89670abc9967d5c0324f27084e177d3ad4397b0a2d368e1c" gracePeriod=600 Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.496744 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" containerID="cf3231110f26c90878f3e6247ea3f5f1ea144fbe936ee3be5855e219d0546edb" exitCode=137 Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.497055 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="907b5e1ebe9d857996e5b110022c5d35112fb529b6dfea81bba9a29a312603db" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.499281 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerID="13a4a99dae6715fc5d0d1fd290c234680802065de27880297f22da33abd21be8" exitCode=0 Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.499298 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerID="b9f174049211a83d89670abc9967d5c0324f27084e177d3ad4397b0a2d368e1c" exitCode=0 Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.499304 4841 generic.go:334] "Generic (PLEG): container finished" podID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerID="6972bf8ee4788e6ef5374972b90e6f8c7283ddeaea8b2c5258363474ffd48dcf" exitCode=0 Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.500162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerDied","Data":"13a4a99dae6715fc5d0d1fd290c234680802065de27880297f22da33abd21be8"} Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.500189 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerDied","Data":"b9f174049211a83d89670abc9967d5c0324f27084e177d3ad4397b0a2d368e1c"} Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.500200 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerDied","Data":"6972bf8ee4788e6ef5374972b90e6f8c7283ddeaea8b2c5258363474ffd48dcf"} Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.560047 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.563405 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" podUID="3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.569605 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.676924 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nj8f\" (UniqueName: \"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-kube-api-access-5nj8f\") pod \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677007 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-combined-ca-bundle\") pod \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677037 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-prometheus-metric-storage-rulefiles-0\") pod \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677062 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config\") pod \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677092 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-tls-assets\") pod \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677135 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-web-config\") pod \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677187 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config-out\") pod \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677235 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llbcf\" (UniqueName: \"kubernetes.io/projected/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-kube-api-access-llbcf\") pod \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677329 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config\") pod \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677377 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-thanos-prometheus-http-client-file\") pod \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677401 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config-secret\") pod 
\"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\" (UID: \"a9e534e4-4bcc-40f6-b7f3-de879530ebf6\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.677460 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\" (UID: \"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632\") " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.678620 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" (UID: "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.679897 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.683805 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-kube-api-access-5nj8f" (OuterVolumeSpecName: "kube-api-access-5nj8f") pod "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" (UID: "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632"). InnerVolumeSpecName "kube-api-access-5nj8f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.697412 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config-out" (OuterVolumeSpecName: "config-out") pod "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" (UID: "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.700461 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config" (OuterVolumeSpecName: "config") pod "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" (UID: "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.702602 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" (UID: "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.702886 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" (UID: "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.704015 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" (UID: "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.719555 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-kube-api-access-llbcf" (OuterVolumeSpecName: "kube-api-access-llbcf") pod "a9e534e4-4bcc-40f6-b7f3-de879530ebf6" (UID: "a9e534e4-4bcc-40f6-b7f3-de879530ebf6"). InnerVolumeSpecName "kube-api-access-llbcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.743729 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a9e534e4-4bcc-40f6-b7f3-de879530ebf6" (UID: "a9e534e4-4bcc-40f6-b7f3-de879530ebf6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.745690 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "a9e534e4-4bcc-40f6-b7f3-de879530ebf6" (UID: "a9e534e4-4bcc-40f6-b7f3-de879530ebf6"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.758780 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-web-config" (OuterVolumeSpecName: "web-config") pod "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" (UID: "a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.771275 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "a9e534e4-4bcc-40f6-b7f3-de879530ebf6" (UID: "a9e534e4-4bcc-40f6-b7f3-de879530ebf6"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783370 4841 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783403 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783463 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783475 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nj8f\" (UniqueName: 
\"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-kube-api-access-5nj8f\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783485 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783494 4841 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783502 4841 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783530 4841 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-web-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783538 4841 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config-out\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783545 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llbcf\" (UniqueName: \"kubernetes.io/projected/a9e534e4-4bcc-40f6-b7f3-de879530ebf6-kube-api-access-llbcf\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.783554 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:16 crc 
kubenswrapper[4841]: I1203 17:55:16.805847 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 17:55:16 crc kubenswrapper[4841]: I1203 17:55:16.884730 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.509845 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632","Type":"ContainerDied","Data":"ee2c16c440c79bcbc951c7f2e4a6998b06f88db0a63ef000f249796b614440e8"} Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.509884 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.509893 4841 scope.go:117] "RemoveContainer" containerID="13a4a99dae6715fc5d0d1fd290c234680802065de27880297f22da33abd21be8" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.513035 4841 generic.go:334] "Generic (PLEG): container finished" podID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerID="fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8" exitCode=0 Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.513070 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerDied","Data":"fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8"} Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.513108 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.518738 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" podUID="3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.553960 4841 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" podUID="3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.555165 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.556265 4841 scope.go:117] "RemoveContainer" containerID="b9f174049211a83d89670abc9967d5c0324f27084e177d3ad4397b0a2d368e1c" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.566172 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.589791 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.590270 4841 scope.go:117] "RemoveContainer" containerID="6972bf8ee4788e6ef5374972b90e6f8c7283ddeaea8b2c5258363474ffd48dcf" Dec 03 17:55:17 crc kubenswrapper[4841]: E1203 17:55:17.590418 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="prometheus" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.590428 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="prometheus" Dec 03 17:55:17 crc kubenswrapper[4841]: E1203 17:55:17.590453 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="thanos-sidecar" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.590468 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="thanos-sidecar" Dec 03 17:55:17 crc kubenswrapper[4841]: E1203 17:55:17.590494 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="config-reloader" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.590502 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="config-reloader" Dec 03 17:55:17 crc kubenswrapper[4841]: E1203 17:55:17.590550 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="init-config-reloader" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.590559 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="init-config-reloader" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.590770 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="config-reloader" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.590787 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="thanos-sidecar" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.590801 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" containerName="prometheus" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.593077 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.622732 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.623844 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.624396 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.624454 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4mqjg" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.624743 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.625009 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.625817 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.663510 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.665176 4841 scope.go:117] "RemoveContainer" containerID="531769ff38addc9f929b0cd933911ab967c6b56cbae5ecf324f79796f04e665f" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716520 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-tls-assets\") pod 
\"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716624 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716701 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4ccr\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-kube-api-access-d4ccr\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716725 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716804 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716818 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716837 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716868 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716914 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.716991 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: 
\"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.717011 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818230 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818280 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818313 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4ccr\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-kube-api-access-d4ccr\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818344 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818421 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818438 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818462 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818495 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818537 4841 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818569 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.818589 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.819456 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.820376 4841 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.825112 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.826049 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.826201 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.828747 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.830304 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.836529 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.836690 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.837643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.846317 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4ccr\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-kube-api-access-d4ccr\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.862233 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"prometheus-metric-storage-0\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:17 crc kubenswrapper[4841]: I1203 17:55:17.944452 4841 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:18 crc kubenswrapper[4841]: I1203 17:55:18.258168 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632" path="/var/lib/kubelet/pods/a9a5bc11-33ca-4bcc-bfd8-ffa3d2c46632/volumes" Dec 03 17:55:18 crc kubenswrapper[4841]: I1203 17:55:18.259217 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e534e4-4bcc-40f6-b7f3-de879530ebf6" path="/var/lib/kubelet/pods/a9e534e4-4bcc-40f6-b7f3-de879530ebf6/volumes" Dec 03 17:55:18 crc kubenswrapper[4841]: W1203 17:55:18.299154 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d4bbe0b_bc34_4450_8ebd_7269e9473f02.slice/crio-5735c810084ff4e2a29d01f41288da6c2de5eecb5e125599124df83c9ed742db WatchSource:0}: Error finding container 5735c810084ff4e2a29d01f41288da6c2de5eecb5e125599124df83c9ed742db: Status 404 returned error can't find the container with id 5735c810084ff4e2a29d01f41288da6c2de5eecb5e125599124df83c9ed742db Dec 03 17:55:18 crc kubenswrapper[4841]: I1203 17:55:18.303663 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:55:18 crc kubenswrapper[4841]: I1203 17:55:18.526300 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerStarted","Data":"5735c810084ff4e2a29d01f41288da6c2de5eecb5e125599124df83c9ed742db"} Dec 03 17:55:18 crc kubenswrapper[4841]: I1203 17:55:18.903777 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.053312 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-config-data\") pod \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.053479 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-scripts\") pod \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.053532 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-public-tls-certs\") pod \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.053580 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-combined-ca-bundle\") pod \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.054038 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-internal-tls-certs\") pod \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.054077 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqkb7\" (UniqueName: 
\"kubernetes.io/projected/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-kube-api-access-tqkb7\") pod \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\" (UID: \"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c\") " Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.059283 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-kube-api-access-tqkb7" (OuterVolumeSpecName: "kube-api-access-tqkb7") pod "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" (UID: "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c"). InnerVolumeSpecName "kube-api-access-tqkb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.059848 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-scripts" (OuterVolumeSpecName: "scripts") pod "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" (UID: "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.132383 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" (UID: "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.151886 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" (UID: "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.156275 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.156302 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.156313 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.156344 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqkb7\" (UniqueName: \"kubernetes.io/projected/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-kube-api-access-tqkb7\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.171983 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" (UID: "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.191510 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-config-data" (OuterVolumeSpecName: "config-data") pod "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" (UID: "c65818a5-dcec-4ea5-8fe7-f7a02f283b3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.259469 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.259530 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.541618 4841 generic.go:334] "Generic (PLEG): container finished" podID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerID="990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a" exitCode=0 Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.541682 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerDied","Data":"990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a"} Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.541707 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.541731 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"c65818a5-dcec-4ea5-8fe7-f7a02f283b3c","Type":"ContainerDied","Data":"5357bb547fc286182de9277791c7f92f61f97c64dc2fe67cef50f5322db8243d"} Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.541771 4841 scope.go:117] "RemoveContainer" containerID="990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.563142 4841 scope.go:117] "RemoveContainer" containerID="fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.596931 4841 scope.go:117] "RemoveContainer" containerID="286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.602233 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.618962 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.640457 4841 scope.go:117] "RemoveContainer" containerID="2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.640629 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 17:55:19 crc kubenswrapper[4841]: E1203 17:55:19.641229 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-listener" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.641254 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-listener" Dec 03 17:55:19 crc kubenswrapper[4841]: E1203 17:55:19.641280 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-api" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.641292 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-api" Dec 03 17:55:19 crc kubenswrapper[4841]: E1203 17:55:19.641312 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-notifier" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.641323 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-notifier" Dec 03 17:55:19 crc kubenswrapper[4841]: E1203 17:55:19.641376 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-evaluator" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.641388 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-evaluator" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.641688 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-api" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.641724 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-evaluator" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.641758 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-listener" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.641804 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" containerName="aodh-notifier" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.649296 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 
17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.649413 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.653607 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.654008 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.654734 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.655138 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9f96b" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.655461 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.678270 4841 scope.go:117] "RemoveContainer" containerID="990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a" Dec 03 17:55:19 crc kubenswrapper[4841]: E1203 17:55:19.678823 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a\": container with ID starting with 990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a not found: ID does not exist" containerID="990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.678895 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a"} err="failed to get container status \"990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a\": rpc error: code = NotFound desc 
= could not find container \"990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a\": container with ID starting with 990cafe2731cf26308399491f8a583a104233502427b7d0266bafc2c9200905a not found: ID does not exist" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.678994 4841 scope.go:117] "RemoveContainer" containerID="fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8" Dec 03 17:55:19 crc kubenswrapper[4841]: E1203 17:55:19.679430 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8\": container with ID starting with fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8 not found: ID does not exist" containerID="fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.679474 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8"} err="failed to get container status \"fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8\": rpc error: code = NotFound desc = could not find container \"fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8\": container with ID starting with fe9b7748dd00106bcc9d274d807ad554afc86ff3b7e2a8e9be984e8efcf4fac8 not found: ID does not exist" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.679507 4841 scope.go:117] "RemoveContainer" containerID="286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3" Dec 03 17:55:19 crc kubenswrapper[4841]: E1203 17:55:19.679859 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3\": container with ID starting with 286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3 not 
found: ID does not exist" containerID="286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.679900 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3"} err="failed to get container status \"286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3\": rpc error: code = NotFound desc = could not find container \"286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3\": container with ID starting with 286b81add3ea3663b208fe00459b73abb12bb3d6963f0a7933366e300ff443a3 not found: ID does not exist" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.679949 4841 scope.go:117] "RemoveContainer" containerID="2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0" Dec 03 17:55:19 crc kubenswrapper[4841]: E1203 17:55:19.680295 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0\": container with ID starting with 2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0 not found: ID does not exist" containerID="2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.680321 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0"} err="failed to get container status \"2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0\": rpc error: code = NotFound desc = could not find container \"2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0\": container with ID starting with 2e90fcdf453a8712f77b77efbcc82445fbc2315fbf033f216bac017bfe1fb8a0 not found: ID does not exist" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.769398 
4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-config-data\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.769710 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-public-tls-certs\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.769740 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-internal-tls-certs\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.769951 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66c7p\" (UniqueName: \"kubernetes.io/projected/ab899add-1786-48e5-9190-25b2f51d6569-kube-api-access-66c7p\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.770055 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.770129 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-scripts\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.872470 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-config-data\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.872515 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-public-tls-certs\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.872538 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-internal-tls-certs\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.872633 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66c7p\" (UniqueName: \"kubernetes.io/projected/ab899add-1786-48e5-9190-25b2f51d6569-kube-api-access-66c7p\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.872671 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.872702 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-scripts\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.879707 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-scripts\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.879715 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-public-tls-certs\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.879802 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-combined-ca-bundle\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.879946 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-internal-tls-certs\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.880727 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-config-data\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.900177 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-66c7p\" (UniqueName: \"kubernetes.io/projected/ab899add-1786-48e5-9190-25b2f51d6569-kube-api-access-66c7p\") pod \"aodh-0\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " pod="openstack/aodh-0" Dec 03 17:55:19 crc kubenswrapper[4841]: I1203 17:55:19.977176 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 17:55:20 crc kubenswrapper[4841]: I1203 17:55:20.249725 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c65818a5-dcec-4ea5-8fe7-f7a02f283b3c" path="/var/lib/kubelet/pods/c65818a5-dcec-4ea5-8fe7-f7a02f283b3c/volumes" Dec 03 17:55:20 crc kubenswrapper[4841]: I1203 17:55:20.498405 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 17:55:20 crc kubenswrapper[4841]: I1203 17:55:20.558250 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerStarted","Data":"b7fbbd740d8e07d018391d1ac9d11345274df53b479e0456187df3a3d875a559"} Dec 03 17:55:21 crc kubenswrapper[4841]: I1203 17:55:21.239491 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" Dec 03 17:55:21 crc kubenswrapper[4841]: I1203 17:55:21.573516 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerStarted","Data":"5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db"} Dec 03 17:55:22 crc kubenswrapper[4841]: I1203 17:55:22.584395 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerStarted","Data":"6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa"} Dec 03 17:55:22 crc kubenswrapper[4841]: I1203 17:55:22.587026 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/prometheus-metric-storage-0" event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerStarted","Data":"5c27f1ebca17a3f9be1ab1ea3f0707941dde25a81ec866c7816a41f8f3ecd479"} Dec 03 17:55:22 crc kubenswrapper[4841]: I1203 17:55:22.597374 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"44d7a5c743f6af1d637bc99274fd6c329d3411c0bab643733a894f3ea3a95e72"} Dec 03 17:55:23 crc kubenswrapper[4841]: I1203 17:55:23.617563 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerStarted","Data":"a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4"} Dec 03 17:55:23 crc kubenswrapper[4841]: I1203 17:55:23.618654 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerStarted","Data":"dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934"} Dec 03 17:55:23 crc kubenswrapper[4841]: I1203 17:55:23.663376 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.000567576 podStartE2EDuration="4.663356303s" podCreationTimestamp="2025-12-03 17:55:19 +0000 UTC" firstStartedPulling="2025-12-03 17:55:20.510496411 +0000 UTC m=+3314.898017138" lastFinishedPulling="2025-12-03 17:55:23.173285138 +0000 UTC m=+3317.560805865" observedRunningTime="2025-12-03 17:55:23.652707929 +0000 UTC m=+3318.040228736" watchObservedRunningTime="2025-12-03 17:55:23.663356303 +0000 UTC m=+3318.050877030" Dec 03 17:55:25 crc kubenswrapper[4841]: I1203 17:55:25.611552 4841 scope.go:117] "RemoveContainer" containerID="cf3231110f26c90878f3e6247ea3f5f1ea144fbe936ee3be5855e219d0546edb" Dec 03 17:55:30 crc kubenswrapper[4841]: I1203 17:55:30.710987 4841 generic.go:334] "Generic 
(PLEG): container finished" podID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerID="5c27f1ebca17a3f9be1ab1ea3f0707941dde25a81ec866c7816a41f8f3ecd479" exitCode=0 Dec 03 17:55:30 crc kubenswrapper[4841]: I1203 17:55:30.711112 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerDied","Data":"5c27f1ebca17a3f9be1ab1ea3f0707941dde25a81ec866c7816a41f8f3ecd479"} Dec 03 17:55:31 crc kubenswrapper[4841]: I1203 17:55:31.726130 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerStarted","Data":"27109f5910226ee377a1094a5c3273443f2c273b4d853b4378f27b347f25c7cc"} Dec 03 17:55:35 crc kubenswrapper[4841]: I1203 17:55:35.787664 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerStarted","Data":"f846d9bb3ec456d8b2d776551f93d3f06b4b586b6dd35ea14a65e48617f1b835"} Dec 03 17:55:35 crc kubenswrapper[4841]: I1203 17:55:35.788203 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerStarted","Data":"c874d8d6035533f3b70ba885e2e7152bb42998e2a90e71342dba8aebb97aa00d"} Dec 03 17:55:35 crc kubenswrapper[4841]: I1203 17:55:35.820780 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.820757907 podStartE2EDuration="18.820757907s" podCreationTimestamp="2025-12-03 17:55:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 17:55:35.818252855 +0000 UTC m=+3330.205773612" watchObservedRunningTime="2025-12-03 17:55:35.820757907 +0000 UTC m=+3330.208278644" Dec 03 
17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.608976 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5ncfz"] Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.613105 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.641032 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5ncfz"] Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.668264 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-catalog-content\") pod \"community-operators-5ncfz\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.668616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-utilities\") pod \"community-operators-5ncfz\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.668922 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjwvt\" (UniqueName: \"kubernetes.io/projected/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-kube-api-access-wjwvt\") pod \"community-operators-5ncfz\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.771566 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjwvt\" (UniqueName: 
\"kubernetes.io/projected/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-kube-api-access-wjwvt\") pod \"community-operators-5ncfz\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.771927 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-catalog-content\") pod \"community-operators-5ncfz\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.772078 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-utilities\") pod \"community-operators-5ncfz\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.772456 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-catalog-content\") pod \"community-operators-5ncfz\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.772564 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-utilities\") pod \"community-operators-5ncfz\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.804923 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjwvt\" (UniqueName: 
\"kubernetes.io/projected/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-kube-api-access-wjwvt\") pod \"community-operators-5ncfz\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:36 crc kubenswrapper[4841]: I1203 17:55:36.941639 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:37 crc kubenswrapper[4841]: I1203 17:55:37.400103 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5ncfz"] Dec 03 17:55:37 crc kubenswrapper[4841]: W1203 17:55:37.402628 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7772106_f808_4f8d_88a5_8d92ee2fa3cb.slice/crio-ac48876b6b8843770a196896120367ca3ba3f47248fa7c762bfc1ce0f9ebeaae WatchSource:0}: Error finding container ac48876b6b8843770a196896120367ca3ba3f47248fa7c762bfc1ce0f9ebeaae: Status 404 returned error can't find the container with id ac48876b6b8843770a196896120367ca3ba3f47248fa7c762bfc1ce0f9ebeaae Dec 03 17:55:37 crc kubenswrapper[4841]: I1203 17:55:37.805358 4841 generic.go:334] "Generic (PLEG): container finished" podID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerID="ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767" exitCode=0 Dec 03 17:55:37 crc kubenswrapper[4841]: I1203 17:55:37.805439 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ncfz" event={"ID":"f7772106-f808-4f8d-88a5-8d92ee2fa3cb","Type":"ContainerDied","Data":"ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767"} Dec 03 17:55:37 crc kubenswrapper[4841]: I1203 17:55:37.805743 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ncfz" 
event={"ID":"f7772106-f808-4f8d-88a5-8d92ee2fa3cb","Type":"ContainerStarted","Data":"ac48876b6b8843770a196896120367ca3ba3f47248fa7c762bfc1ce0f9ebeaae"} Dec 03 17:55:37 crc kubenswrapper[4841]: I1203 17:55:37.945357 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:38 crc kubenswrapper[4841]: I1203 17:55:38.818670 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ncfz" event={"ID":"f7772106-f808-4f8d-88a5-8d92ee2fa3cb","Type":"ContainerStarted","Data":"99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98"} Dec 03 17:55:39 crc kubenswrapper[4841]: I1203 17:55:39.829178 4841 generic.go:334] "Generic (PLEG): container finished" podID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerID="99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98" exitCode=0 Dec 03 17:55:39 crc kubenswrapper[4841]: I1203 17:55:39.829223 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ncfz" event={"ID":"f7772106-f808-4f8d-88a5-8d92ee2fa3cb","Type":"ContainerDied","Data":"99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98"} Dec 03 17:55:40 crc kubenswrapper[4841]: I1203 17:55:40.857607 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ncfz" event={"ID":"f7772106-f808-4f8d-88a5-8d92ee2fa3cb","Type":"ContainerStarted","Data":"a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7"} Dec 03 17:55:40 crc kubenswrapper[4841]: I1203 17:55:40.878415 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5ncfz" podStartSLOduration=2.430484203 podStartE2EDuration="4.878399864s" podCreationTimestamp="2025-12-03 17:55:36 +0000 UTC" firstStartedPulling="2025-12-03 17:55:37.806526816 +0000 UTC m=+3332.194047533" lastFinishedPulling="2025-12-03 17:55:40.254442477 +0000 
UTC m=+3334.641963194" observedRunningTime="2025-12-03 17:55:40.87621533 +0000 UTC m=+3335.263736057" watchObservedRunningTime="2025-12-03 17:55:40.878399864 +0000 UTC m=+3335.265920591" Dec 03 17:55:46 crc kubenswrapper[4841]: I1203 17:55:46.942787 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:46 crc kubenswrapper[4841]: I1203 17:55:46.943369 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:47 crc kubenswrapper[4841]: I1203 17:55:47.035759 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:47 crc kubenswrapper[4841]: I1203 17:55:47.945738 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:47 crc kubenswrapper[4841]: I1203 17:55:47.953699 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:48 crc kubenswrapper[4841]: I1203 17:55:48.027546 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:48 crc kubenswrapper[4841]: I1203 17:55:48.103894 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5ncfz"] Dec 03 17:55:48 crc kubenswrapper[4841]: I1203 17:55:48.965109 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 17:55:49 crc kubenswrapper[4841]: I1203 17:55:49.970921 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5ncfz" podUID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerName="registry-server" 
containerID="cri-o://a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7" gracePeriod=2 Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.465531 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.621558 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-utilities\") pod \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.621782 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjwvt\" (UniqueName: \"kubernetes.io/projected/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-kube-api-access-wjwvt\") pod \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.621830 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-catalog-content\") pod \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\" (UID: \"f7772106-f808-4f8d-88a5-8d92ee2fa3cb\") " Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.623111 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-utilities" (OuterVolumeSpecName: "utilities") pod "f7772106-f808-4f8d-88a5-8d92ee2fa3cb" (UID: "f7772106-f808-4f8d-88a5-8d92ee2fa3cb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.629570 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-kube-api-access-wjwvt" (OuterVolumeSpecName: "kube-api-access-wjwvt") pod "f7772106-f808-4f8d-88a5-8d92ee2fa3cb" (UID: "f7772106-f808-4f8d-88a5-8d92ee2fa3cb"). InnerVolumeSpecName "kube-api-access-wjwvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.700795 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f7772106-f808-4f8d-88a5-8d92ee2fa3cb" (UID: "f7772106-f808-4f8d-88a5-8d92ee2fa3cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.724533 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjwvt\" (UniqueName: \"kubernetes.io/projected/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-kube-api-access-wjwvt\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.724585 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.724603 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f7772106-f808-4f8d-88a5-8d92ee2fa3cb-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.996724 4841 generic.go:334] "Generic (PLEG): container finished" podID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" 
containerID="a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7" exitCode=0 Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.996805 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ncfz" event={"ID":"f7772106-f808-4f8d-88a5-8d92ee2fa3cb","Type":"ContainerDied","Data":"a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7"} Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.996864 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5ncfz" event={"ID":"f7772106-f808-4f8d-88a5-8d92ee2fa3cb","Type":"ContainerDied","Data":"ac48876b6b8843770a196896120367ca3ba3f47248fa7c762bfc1ce0f9ebeaae"} Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.996899 4841 scope.go:117] "RemoveContainer" containerID="a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7" Dec 03 17:55:50 crc kubenswrapper[4841]: I1203 17:55:50.997190 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5ncfz" Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.062810 4841 scope.go:117] "RemoveContainer" containerID="99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98" Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.070184 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5ncfz"] Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.097053 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5ncfz"] Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.104125 4841 scope.go:117] "RemoveContainer" containerID="ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767" Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.165629 4841 scope.go:117] "RemoveContainer" containerID="a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7" Dec 03 17:55:51 crc kubenswrapper[4841]: E1203 17:55:51.166206 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7\": container with ID starting with a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7 not found: ID does not exist" containerID="a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7" Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.166248 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7"} err="failed to get container status \"a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7\": rpc error: code = NotFound desc = could not find container \"a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7\": container with ID starting with a31cfa5e4076e664984ce8029667f2b300bd46cfb44f5fc8cd3e40f7cccaafa7 not 
found: ID does not exist" Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.166276 4841 scope.go:117] "RemoveContainer" containerID="99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98" Dec 03 17:55:51 crc kubenswrapper[4841]: E1203 17:55:51.166546 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98\": container with ID starting with 99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98 not found: ID does not exist" containerID="99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98" Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.166574 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98"} err="failed to get container status \"99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98\": rpc error: code = NotFound desc = could not find container \"99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98\": container with ID starting with 99d7a94e8f7cde48dc033c05cf2a09aee11b39a58b0d57da2c619d94826edd98 not found: ID does not exist" Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.166593 4841 scope.go:117] "RemoveContainer" containerID="ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767" Dec 03 17:55:51 crc kubenswrapper[4841]: E1203 17:55:51.169167 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767\": container with ID starting with ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767 not found: ID does not exist" containerID="ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767" Dec 03 17:55:51 crc kubenswrapper[4841]: I1203 17:55:51.169198 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767"} err="failed to get container status \"ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767\": rpc error: code = NotFound desc = could not find container \"ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767\": container with ID starting with ce9483a3c81249c05ab7ca14d1a5b19cc09f17ba768957bf10b92f7d29fb8767 not found: ID does not exist" Dec 03 17:55:52 crc kubenswrapper[4841]: I1203 17:55:52.253986 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" path="/var/lib/kubelet/pods/f7772106-f808-4f8d-88a5-8d92ee2fa3cb/volumes" Dec 03 17:55:54 crc kubenswrapper[4841]: I1203 17:55:54.972094 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-khncz" podUID="38a95f1c-87ae-4464-b6fa-ad329d17290e" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.78:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 17:55:57 crc kubenswrapper[4841]: I1203 17:55:57.882292 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ctb8r"] Dec 03 17:55:57 crc kubenswrapper[4841]: E1203 17:55:57.883871 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerName="extract-content" Dec 03 17:55:57 crc kubenswrapper[4841]: I1203 17:55:57.883901 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerName="extract-content" Dec 03 17:55:57 crc kubenswrapper[4841]: E1203 17:55:57.883993 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerName="registry-server" Dec 03 17:55:57 crc 
kubenswrapper[4841]: I1203 17:55:57.884007 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerName="registry-server" Dec 03 17:55:57 crc kubenswrapper[4841]: E1203 17:55:57.884042 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerName="extract-utilities" Dec 03 17:55:57 crc kubenswrapper[4841]: I1203 17:55:57.884057 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerName="extract-utilities" Dec 03 17:55:57 crc kubenswrapper[4841]: I1203 17:55:57.884447 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7772106-f808-4f8d-88a5-8d92ee2fa3cb" containerName="registry-server" Dec 03 17:55:57 crc kubenswrapper[4841]: I1203 17:55:57.887235 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:57 crc kubenswrapper[4841]: I1203 17:55:57.901662 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gr6z\" (UniqueName: \"kubernetes.io/projected/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-kube-api-access-7gr6z\") pod \"certified-operators-ctb8r\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:57 crc kubenswrapper[4841]: I1203 17:55:57.901759 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-catalog-content\") pod \"certified-operators-ctb8r\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:57 crc kubenswrapper[4841]: I1203 17:55:57.902438 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-utilities\") pod \"certified-operators-ctb8r\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:57 crc kubenswrapper[4841]: I1203 17:55:57.903396 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ctb8r"] Dec 03 17:55:58 crc kubenswrapper[4841]: I1203 17:55:58.004088 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-catalog-content\") pod \"certified-operators-ctb8r\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:58 crc kubenswrapper[4841]: I1203 17:55:58.004362 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-utilities\") pod \"certified-operators-ctb8r\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:58 crc kubenswrapper[4841]: I1203 17:55:58.004450 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gr6z\" (UniqueName: \"kubernetes.io/projected/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-kube-api-access-7gr6z\") pod \"certified-operators-ctb8r\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:58 crc kubenswrapper[4841]: I1203 17:55:58.004899 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-catalog-content\") pod \"certified-operators-ctb8r\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:58 crc 
kubenswrapper[4841]: I1203 17:55:58.005701 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-utilities\") pod \"certified-operators-ctb8r\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:58 crc kubenswrapper[4841]: I1203 17:55:58.033063 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gr6z\" (UniqueName: \"kubernetes.io/projected/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-kube-api-access-7gr6z\") pod \"certified-operators-ctb8r\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:58 crc kubenswrapper[4841]: I1203 17:55:58.230598 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:55:58 crc kubenswrapper[4841]: I1203 17:55:58.736265 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ctb8r"] Dec 03 17:55:59 crc kubenswrapper[4841]: I1203 17:55:59.091134 4841 generic.go:334] "Generic (PLEG): container finished" podID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerID="f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4" exitCode=0 Dec 03 17:55:59 crc kubenswrapper[4841]: I1203 17:55:59.091403 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctb8r" event={"ID":"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca","Type":"ContainerDied","Data":"f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4"} Dec 03 17:55:59 crc kubenswrapper[4841]: I1203 17:55:59.091432 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctb8r" 
event={"ID":"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca","Type":"ContainerStarted","Data":"757487eb3fc73a66d933580e7cbb362a91cb4b01c6e9d3c73c2c152b7e555a0a"} Dec 03 17:56:00 crc kubenswrapper[4841]: I1203 17:56:00.110176 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctb8r" event={"ID":"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca","Type":"ContainerStarted","Data":"47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654"} Dec 03 17:56:01 crc kubenswrapper[4841]: I1203 17:56:01.125609 4841 generic.go:334] "Generic (PLEG): container finished" podID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerID="47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654" exitCode=0 Dec 03 17:56:01 crc kubenswrapper[4841]: I1203 17:56:01.125660 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctb8r" event={"ID":"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca","Type":"ContainerDied","Data":"47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654"} Dec 03 17:56:02 crc kubenswrapper[4841]: I1203 17:56:02.139575 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctb8r" event={"ID":"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca","Type":"ContainerStarted","Data":"bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738"} Dec 03 17:56:02 crc kubenswrapper[4841]: I1203 17:56:02.165187 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ctb8r" podStartSLOduration=2.485648177 podStartE2EDuration="5.165155278s" podCreationTimestamp="2025-12-03 17:55:57 +0000 UTC" firstStartedPulling="2025-12-03 17:55:59.092945371 +0000 UTC m=+3353.480466098" lastFinishedPulling="2025-12-03 17:56:01.772452442 +0000 UTC m=+3356.159973199" observedRunningTime="2025-12-03 17:56:02.164662825 +0000 UTC m=+3356.552183562" watchObservedRunningTime="2025-12-03 17:56:02.165155278 +0000 UTC 
m=+3356.552676055" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.461515 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wxh7z"] Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.465652 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.478480 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxh7z"] Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.570887 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-utilities\") pod \"redhat-operators-wxh7z\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.571068 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czhtn\" (UniqueName: \"kubernetes.io/projected/58f6bf6b-0a20-433b-847e-8171f0c170a0-kube-api-access-czhtn\") pod \"redhat-operators-wxh7z\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.571562 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-catalog-content\") pod \"redhat-operators-wxh7z\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.674252 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-catalog-content\") pod \"redhat-operators-wxh7z\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.674394 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-utilities\") pod \"redhat-operators-wxh7z\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.674425 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czhtn\" (UniqueName: \"kubernetes.io/projected/58f6bf6b-0a20-433b-847e-8171f0c170a0-kube-api-access-czhtn\") pod \"redhat-operators-wxh7z\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.674738 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-catalog-content\") pod \"redhat-operators-wxh7z\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.674805 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-utilities\") pod \"redhat-operators-wxh7z\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.693602 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czhtn\" (UniqueName: 
\"kubernetes.io/projected/58f6bf6b-0a20-433b-847e-8171f0c170a0-kube-api-access-czhtn\") pod \"redhat-operators-wxh7z\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:03 crc kubenswrapper[4841]: I1203 17:56:03.798746 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:04 crc kubenswrapper[4841]: I1203 17:56:04.288150 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wxh7z"] Dec 03 17:56:04 crc kubenswrapper[4841]: W1203 17:56:04.292980 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58f6bf6b_0a20_433b_847e_8171f0c170a0.slice/crio-4bb433e8746876e03497c99f838f662458fb5b5accf90de6ca68a59c20c12d28 WatchSource:0}: Error finding container 4bb433e8746876e03497c99f838f662458fb5b5accf90de6ca68a59c20c12d28: Status 404 returned error can't find the container with id 4bb433e8746876e03497c99f838f662458fb5b5accf90de6ca68a59c20c12d28 Dec 03 17:56:04 crc kubenswrapper[4841]: E1203 17:56:04.787985 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58f6bf6b_0a20_433b_847e_8171f0c170a0.slice/crio-547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629.scope\": RecentStats: unable to find data in memory cache]" Dec 03 17:56:05 crc kubenswrapper[4841]: I1203 17:56:05.179707 4841 generic.go:334] "Generic (PLEG): container finished" podID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerID="547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629" exitCode=0 Dec 03 17:56:05 crc kubenswrapper[4841]: I1203 17:56:05.179773 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxh7z" 
event={"ID":"58f6bf6b-0a20-433b-847e-8171f0c170a0","Type":"ContainerDied","Data":"547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629"} Dec 03 17:56:05 crc kubenswrapper[4841]: I1203 17:56:05.179827 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxh7z" event={"ID":"58f6bf6b-0a20-433b-847e-8171f0c170a0","Type":"ContainerStarted","Data":"4bb433e8746876e03497c99f838f662458fb5b5accf90de6ca68a59c20c12d28"} Dec 03 17:56:06 crc kubenswrapper[4841]: I1203 17:56:06.195591 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxh7z" event={"ID":"58f6bf6b-0a20-433b-847e-8171f0c170a0","Type":"ContainerStarted","Data":"03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde"} Dec 03 17:56:08 crc kubenswrapper[4841]: I1203 17:56:08.231550 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:56:08 crc kubenswrapper[4841]: I1203 17:56:08.231813 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:56:08 crc kubenswrapper[4841]: I1203 17:56:08.313885 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:56:09 crc kubenswrapper[4841]: I1203 17:56:09.321414 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:56:10 crc kubenswrapper[4841]: I1203 17:56:10.258550 4841 generic.go:334] "Generic (PLEG): container finished" podID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerID="03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde" exitCode=0 Dec 03 17:56:10 crc kubenswrapper[4841]: I1203 17:56:10.275546 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxh7z" 
event={"ID":"58f6bf6b-0a20-433b-847e-8171f0c170a0","Type":"ContainerDied","Data":"03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde"} Dec 03 17:56:10 crc kubenswrapper[4841]: I1203 17:56:10.455744 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ctb8r"] Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.273295 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxh7z" event={"ID":"58f6bf6b-0a20-433b-847e-8171f0c170a0","Type":"ContainerStarted","Data":"1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c"} Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.273596 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ctb8r" podUID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerName="registry-server" containerID="cri-o://bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738" gracePeriod=2 Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.299791 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wxh7z" podStartSLOduration=2.818790856 podStartE2EDuration="8.299771757s" podCreationTimestamp="2025-12-03 17:56:03 +0000 UTC" firstStartedPulling="2025-12-03 17:56:05.183072845 +0000 UTC m=+3359.570593592" lastFinishedPulling="2025-12-03 17:56:10.664053766 +0000 UTC m=+3365.051574493" observedRunningTime="2025-12-03 17:56:11.292755044 +0000 UTC m=+3365.680275831" watchObservedRunningTime="2025-12-03 17:56:11.299771757 +0000 UTC m=+3365.687292494" Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.836611 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.862171 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-utilities\") pod \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.862260 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-catalog-content\") pod \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.862322 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7gr6z\" (UniqueName: \"kubernetes.io/projected/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-kube-api-access-7gr6z\") pod \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\" (UID: \"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca\") " Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.868549 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-utilities" (OuterVolumeSpecName: "utilities") pod "ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" (UID: "ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.891527 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-kube-api-access-7gr6z" (OuterVolumeSpecName: "kube-api-access-7gr6z") pod "ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" (UID: "ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca"). InnerVolumeSpecName "kube-api-access-7gr6z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.936061 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" (UID: "ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.964455 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.964489 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7gr6z\" (UniqueName: \"kubernetes.io/projected/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-kube-api-access-7gr6z\") on node \"crc\" DevicePath \"\"" Dec 03 17:56:11 crc kubenswrapper[4841]: I1203 17:56:11.964499 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.286658 4841 generic.go:334] "Generic (PLEG): container finished" podID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerID="bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738" exitCode=0 Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.286729 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ctb8r" event={"ID":"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca","Type":"ContainerDied","Data":"bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738"} Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.286776 4841 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-ctb8r" event={"ID":"ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca","Type":"ContainerDied","Data":"757487eb3fc73a66d933580e7cbb362a91cb4b01c6e9d3c73c2c152b7e555a0a"} Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.286812 4841 scope.go:117] "RemoveContainer" containerID="bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.287108 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ctb8r" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.319696 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ctb8r"] Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.326249 4841 scope.go:117] "RemoveContainer" containerID="47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.328738 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ctb8r"] Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.348638 4841 scope.go:117] "RemoveContainer" containerID="f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.409393 4841 scope.go:117] "RemoveContainer" containerID="bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738" Dec 03 17:56:12 crc kubenswrapper[4841]: E1203 17:56:12.410316 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738\": container with ID starting with bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738 not found: ID does not exist" containerID="bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 
17:56:12.410362 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738"} err="failed to get container status \"bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738\": rpc error: code = NotFound desc = could not find container \"bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738\": container with ID starting with bf5d2a097990580574d4a92acfc1882b1bd2bc0966ec1a70fac33da8fc43c738 not found: ID does not exist" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.410388 4841 scope.go:117] "RemoveContainer" containerID="47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654" Dec 03 17:56:12 crc kubenswrapper[4841]: E1203 17:56:12.413948 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654\": container with ID starting with 47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654 not found: ID does not exist" containerID="47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.413981 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654"} err="failed to get container status \"47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654\": rpc error: code = NotFound desc = could not find container \"47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654\": container with ID starting with 47a28cb0c77a0b97c093734d897475f681d5e2d6a884364143ed0a27ffcc0654 not found: ID does not exist" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.414007 4841 scope.go:117] "RemoveContainer" containerID="f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4" Dec 03 17:56:12 crc 
kubenswrapper[4841]: E1203 17:56:12.414632 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4\": container with ID starting with f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4 not found: ID does not exist" containerID="f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4" Dec 03 17:56:12 crc kubenswrapper[4841]: I1203 17:56:12.414681 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4"} err="failed to get container status \"f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4\": rpc error: code = NotFound desc = could not find container \"f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4\": container with ID starting with f1db54698d5ee5d3f8ece47ed92dc6fa62c2435ee1f0c1c2674e148932208ec4 not found: ID does not exist" Dec 03 17:56:13 crc kubenswrapper[4841]: I1203 17:56:13.799278 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:13 crc kubenswrapper[4841]: I1203 17:56:13.799652 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:14 crc kubenswrapper[4841]: I1203 17:56:14.251045 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" path="/var/lib/kubelet/pods/ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca/volumes" Dec 03 17:56:14 crc kubenswrapper[4841]: I1203 17:56:14.888889 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wxh7z" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerName="registry-server" probeResult="failure" output=< Dec 03 17:56:14 crc kubenswrapper[4841]: timeout: failed to 
connect service ":50051" within 1s Dec 03 17:56:14 crc kubenswrapper[4841]: > Dec 03 17:56:23 crc kubenswrapper[4841]: I1203 17:56:23.866443 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:23 crc kubenswrapper[4841]: I1203 17:56:23.943375 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:24 crc kubenswrapper[4841]: I1203 17:56:24.120085 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxh7z"] Dec 03 17:56:25 crc kubenswrapper[4841]: I1203 17:56:25.447447 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wxh7z" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerName="registry-server" containerID="cri-o://1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c" gracePeriod=2 Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.002787 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.195746 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-utilities\") pod \"58f6bf6b-0a20-433b-847e-8171f0c170a0\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.195983 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-catalog-content\") pod \"58f6bf6b-0a20-433b-847e-8171f0c170a0\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.196125 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czhtn\" (UniqueName: \"kubernetes.io/projected/58f6bf6b-0a20-433b-847e-8171f0c170a0-kube-api-access-czhtn\") pod \"58f6bf6b-0a20-433b-847e-8171f0c170a0\" (UID: \"58f6bf6b-0a20-433b-847e-8171f0c170a0\") " Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.196816 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-utilities" (OuterVolumeSpecName: "utilities") pod "58f6bf6b-0a20-433b-847e-8171f0c170a0" (UID: "58f6bf6b-0a20-433b-847e-8171f0c170a0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.196985 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.205001 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58f6bf6b-0a20-433b-847e-8171f0c170a0-kube-api-access-czhtn" (OuterVolumeSpecName: "kube-api-access-czhtn") pod "58f6bf6b-0a20-433b-847e-8171f0c170a0" (UID: "58f6bf6b-0a20-433b-847e-8171f0c170a0"). InnerVolumeSpecName "kube-api-access-czhtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.298743 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czhtn\" (UniqueName: \"kubernetes.io/projected/58f6bf6b-0a20-433b-847e-8171f0c170a0-kube-api-access-czhtn\") on node \"crc\" DevicePath \"\"" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.345761 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58f6bf6b-0a20-433b-847e-8171f0c170a0" (UID: "58f6bf6b-0a20-433b-847e-8171f0c170a0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.401164 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58f6bf6b-0a20-433b-847e-8171f0c170a0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.461185 4841 generic.go:334] "Generic (PLEG): container finished" podID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerID="1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c" exitCode=0 Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.461226 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxh7z" event={"ID":"58f6bf6b-0a20-433b-847e-8171f0c170a0","Type":"ContainerDied","Data":"1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c"} Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.461249 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wxh7z" event={"ID":"58f6bf6b-0a20-433b-847e-8171f0c170a0","Type":"ContainerDied","Data":"4bb433e8746876e03497c99f838f662458fb5b5accf90de6ca68a59c20c12d28"} Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.461268 4841 scope.go:117] "RemoveContainer" containerID="1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.463512 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wxh7z" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.499172 4841 scope.go:117] "RemoveContainer" containerID="03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.503063 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wxh7z"] Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.511716 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wxh7z"] Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.533358 4841 scope.go:117] "RemoveContainer" containerID="547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.595256 4841 scope.go:117] "RemoveContainer" containerID="1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c" Dec 03 17:56:26 crc kubenswrapper[4841]: E1203 17:56:26.595829 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c\": container with ID starting with 1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c not found: ID does not exist" containerID="1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.595870 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c"} err="failed to get container status \"1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c\": rpc error: code = NotFound desc = could not find container \"1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c\": container with ID starting with 1223c992898beb4f041da212858277b26d41a1a0280287b0a3d45772b624061c not found: ID does 
not exist" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.595899 4841 scope.go:117] "RemoveContainer" containerID="03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde" Dec 03 17:56:26 crc kubenswrapper[4841]: E1203 17:56:26.596314 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde\": container with ID starting with 03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde not found: ID does not exist" containerID="03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.596353 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde"} err="failed to get container status \"03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde\": rpc error: code = NotFound desc = could not find container \"03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde\": container with ID starting with 03bb1e9920f265387b0e2619241629a5a1824c808a1091a105a3a1d6ed007dde not found: ID does not exist" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.596380 4841 scope.go:117] "RemoveContainer" containerID="547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629" Dec 03 17:56:26 crc kubenswrapper[4841]: E1203 17:56:26.596709 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629\": container with ID starting with 547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629 not found: ID does not exist" containerID="547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629" Dec 03 17:56:26 crc kubenswrapper[4841]: I1203 17:56:26.596735 4841 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629"} err="failed to get container status \"547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629\": rpc error: code = NotFound desc = could not find container \"547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629\": container with ID starting with 547981bdccdb323cfd9feb7d42dabdac45ec62ed4a4897b5ad0815470942d629 not found: ID does not exist" Dec 03 17:56:28 crc kubenswrapper[4841]: I1203 17:56:28.252518 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" path="/var/lib/kubelet/pods/58f6bf6b-0a20-433b-847e-8171f0c170a0/volumes" Dec 03 17:57:39 crc kubenswrapper[4841]: I1203 17:57:39.316671 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:57:39 crc kubenswrapper[4841]: I1203 17:57:39.317169 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:57:53 crc kubenswrapper[4841]: I1203 17:57:53.452317 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/manager/0.log" Dec 03 17:57:55 crc kubenswrapper[4841]: I1203 17:57:55.490552 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:57:55 crc kubenswrapper[4841]: I1203 17:57:55.492337 4841 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="prometheus" containerID="cri-o://27109f5910226ee377a1094a5c3273443f2c273b4d853b4378f27b347f25c7cc" gracePeriod=600 Dec 03 17:57:55 crc kubenswrapper[4841]: I1203 17:57:55.492393 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="thanos-sidecar" containerID="cri-o://f846d9bb3ec456d8b2d776551f93d3f06b4b586b6dd35ea14a65e48617f1b835" gracePeriod=600 Dec 03 17:57:55 crc kubenswrapper[4841]: I1203 17:57:55.492429 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="config-reloader" containerID="cri-o://c874d8d6035533f3b70ba885e2e7152bb42998e2a90e71342dba8aebb97aa00d" gracePeriod=600 Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.495360 4841 generic.go:334] "Generic (PLEG): container finished" podID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerID="f846d9bb3ec456d8b2d776551f93d3f06b4b586b6dd35ea14a65e48617f1b835" exitCode=0 Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.495740 4841 generic.go:334] "Generic (PLEG): container finished" podID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerID="c874d8d6035533f3b70ba885e2e7152bb42998e2a90e71342dba8aebb97aa00d" exitCode=0 Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.495753 4841 generic.go:334] "Generic (PLEG): container finished" podID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerID="27109f5910226ee377a1094a5c3273443f2c273b4d853b4378f27b347f25c7cc" exitCode=0 Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.495807 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerDied","Data":"f846d9bb3ec456d8b2d776551f93d3f06b4b586b6dd35ea14a65e48617f1b835"} Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.495843 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerDied","Data":"c874d8d6035533f3b70ba885e2e7152bb42998e2a90e71342dba8aebb97aa00d"} Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.495855 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerDied","Data":"27109f5910226ee377a1094a5c3273443f2c273b4d853b4378f27b347f25c7cc"} Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.495872 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"8d4bbe0b-bc34-4450-8ebd-7269e9473f02","Type":"ContainerDied","Data":"5735c810084ff4e2a29d01f41288da6c2de5eecb5e125599124df83c9ed742db"} Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.495886 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5735c810084ff4e2a29d01f41288da6c2de5eecb5e125599124df83c9ed742db" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.516723 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585573 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-thanos-prometheus-http-client-file\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585661 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585716 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config-out\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585766 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585820 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-prometheus-metric-storage-rulefiles-0\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 
03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585835 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-secret-combined-ca-bundle\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585899 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-tls-assets\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585935 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585953 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.585997 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.586039 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-d4ccr\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-kube-api-access-d4ccr\") pod \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\" (UID: \"8d4bbe0b-bc34-4450-8ebd-7269e9473f02\") " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.588656 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.593455 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.594320 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config-out" (OuterVolumeSpecName: "config-out") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.594359 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-kube-api-access-d4ccr" (OuterVolumeSpecName: "kube-api-access-d4ccr") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "kube-api-access-d4ccr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.594393 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.595393 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config" (OuterVolumeSpecName: "config") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.596555 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.603148 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.608971 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.609026 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.687982 4841 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config-out\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.688437 4841 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.688449 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.688462 4841 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.688475 4841 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.688483 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.688492 4841 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: 
\"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.688502 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4ccr\" (UniqueName: \"kubernetes.io/projected/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-kube-api-access-d4ccr\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.688512 4841 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.688521 4841 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.690023 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config" (OuterVolumeSpecName: "web-config") pod "8d4bbe0b-bc34-4450-8ebd-7269e9473f02" (UID: "8d4bbe0b-bc34-4450-8ebd-7269e9473f02"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.713983 4841 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.790333 4841 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:56.790362 4841 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/8d4bbe0b-bc34-4450-8ebd-7269e9473f02-web-config\") on node \"crc\" DevicePath \"\"" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:57.516120 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:57.555724 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:57:57 crc kubenswrapper[4841]: I1203 17:57:57.570140 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.263501 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" path="/var/lib/kubelet/pods/8d4bbe0b-bc34-4450-8ebd-7269e9473f02/volumes" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.554963 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555477 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerName="extract-content" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555586 4841 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerName="extract-content" Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555611 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="config-reloader" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555620 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="config-reloader" Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555647 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="prometheus" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555658 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="prometheus" Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555675 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="init-config-reloader" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555685 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="init-config-reloader" Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555704 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerName="extract-utilities" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555712 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerName="extract-utilities" Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555726 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerName="extract-utilities" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555734 4841 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerName="extract-utilities" Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555753 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerName="registry-server" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555761 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerName="registry-server" Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555778 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="thanos-sidecar" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555787 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="thanos-sidecar" Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555804 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerName="extract-content" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555813 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerName="extract-content" Dec 03 17:57:58 crc kubenswrapper[4841]: E1203 17:57:58.555829 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerName="registry-server" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.555837 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerName="registry-server" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.556123 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="58f6bf6b-0a20-433b-847e-8171f0c170a0" containerName="registry-server" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.556141 4841 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="thanos-sidecar" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.556161 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="config-reloader" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.556178 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d4bbe0b-bc34-4450-8ebd-7269e9473f02" containerName="prometheus" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.556197 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab36985c-4a9d-440e-91cd-8c1c5fa4c5ca" containerName="registry-server" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.560163 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.563328 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.564523 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.564636 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.564732 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4mqjg" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.564855 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.574010 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 
17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.576541 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.578873 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616188 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616264 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616346 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616380 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-db\") pod 
\"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616412 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616450 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xd5bt\" (UniqueName: \"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-kube-api-access-xd5bt\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616475 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616499 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616544 4841 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616604 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.616637 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-config\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.720663 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.720738 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc 
kubenswrapper[4841]: I1203 17:57:58.720814 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.720844 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.720869 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.720901 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xd5bt\" (UniqueName: \"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-kube-api-access-xd5bt\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.720945 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.720971 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.721007 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.721037 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.721064 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-config\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.723521 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.735668 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.740597 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.744683 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.747618 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.749841 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xd5bt\" (UniqueName: 
\"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-kube-api-access-xd5bt\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.766218 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.773688 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.773730 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-config\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.774734 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.799691 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:58 crc kubenswrapper[4841]: I1203 17:57:58.893687 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 17:57:59 crc kubenswrapper[4841]: I1203 17:57:59.447876 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 17:57:59 crc kubenswrapper[4841]: I1203 17:57:59.560641 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerStarted","Data":"ff9a9b616e32337b7a497066f5d77dd6de4505771158126f3e57905be15c1d9a"} Dec 03 17:58:03 crc kubenswrapper[4841]: I1203 17:58:03.716304 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerStarted","Data":"1c5f75446cdbe4cd713f74fe99c03603156f3be26d3c09ebcffe149c7d245eb2"} Dec 03 17:58:09 crc kubenswrapper[4841]: I1203 17:58:09.317247 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:58:09 crc kubenswrapper[4841]: I1203 17:58:09.318298 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:58:11 crc kubenswrapper[4841]: I1203 17:58:11.860753 4841 generic.go:334] "Generic (PLEG): container finished" podID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerID="1c5f75446cdbe4cd713f74fe99c03603156f3be26d3c09ebcffe149c7d245eb2" exitCode=0 Dec 03 17:58:11 crc kubenswrapper[4841]: I1203 17:58:11.861226 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerDied","Data":"1c5f75446cdbe4cd713f74fe99c03603156f3be26d3c09ebcffe149c7d245eb2"} Dec 03 17:58:12 crc kubenswrapper[4841]: I1203 17:58:12.872632 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerStarted","Data":"cbb97066852a9f363dfdf3dcd3e2dc7ec1aaa30a4eca966e1b8d0380f30c2d27"} Dec 03 17:58:16 crc kubenswrapper[4841]: I1203 17:58:16.920212 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerStarted","Data":"64891335f24364fa91a90849d22a4b856632847545ff51648145a621cdfa9438"} Dec 03 17:58:16 crc kubenswrapper[4841]: I1203 17:58:16.921990 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerStarted","Data":"401ac88482de116750c86b28264b2a01507c752d7b7051078f6100509637c850"} Dec 03 17:58:16 crc kubenswrapper[4841]: I1203 17:58:16.947292 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=18.947272465 podStartE2EDuration="18.947272465s" podCreationTimestamp="2025-12-03 17:57:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 
UTC" observedRunningTime="2025-12-03 17:58:16.945703936 +0000 UTC m=+3491.333224673" watchObservedRunningTime="2025-12-03 17:58:16.947272465 +0000 UTC m=+3491.334793192" Dec 03 17:58:18 crc kubenswrapper[4841]: I1203 17:58:18.894364 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 17:58:28 crc kubenswrapper[4841]: I1203 17:58:28.894403 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 17:58:28 crc kubenswrapper[4841]: I1203 17:58:28.907349 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 17:58:29 crc kubenswrapper[4841]: I1203 17:58:29.100389 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 17:58:39 crc kubenswrapper[4841]: I1203 17:58:39.316708 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 17:58:39 crc kubenswrapper[4841]: I1203 17:58:39.317628 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 17:58:39 crc kubenswrapper[4841]: I1203 17:58:39.317708 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 17:58:39 crc kubenswrapper[4841]: I1203 17:58:39.318610 4841 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"44d7a5c743f6af1d637bc99274fd6c329d3411c0bab643733a894f3ea3a95e72"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 17:58:39 crc kubenswrapper[4841]: I1203 17:58:39.318746 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://44d7a5c743f6af1d637bc99274fd6c329d3411c0bab643733a894f3ea3a95e72" gracePeriod=600 Dec 03 17:58:40 crc kubenswrapper[4841]: I1203 17:58:40.205354 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="44d7a5c743f6af1d637bc99274fd6c329d3411c0bab643733a894f3ea3a95e72" exitCode=0 Dec 03 17:58:40 crc kubenswrapper[4841]: I1203 17:58:40.205403 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"44d7a5c743f6af1d637bc99274fd6c329d3411c0bab643733a894f3ea3a95e72"} Dec 03 17:58:40 crc kubenswrapper[4841]: I1203 17:58:40.205750 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195"} Dec 03 17:58:40 crc kubenswrapper[4841]: I1203 17:58:40.205778 4841 scope.go:117] "RemoveContainer" containerID="93f943eb31e565bcd10f3307a20298e4b2f1a13803505ae06aa0a4d778218d12" Dec 03 17:59:19 crc kubenswrapper[4841]: I1203 17:59:19.048772 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-64fc-account-create-update-brbfq"] Dec 03 17:59:19 crc 
kubenswrapper[4841]: I1203 17:59:19.062151 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-h97vq"] Dec 03 17:59:19 crc kubenswrapper[4841]: I1203 17:59:19.073239 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-h97vq"] Dec 03 17:59:19 crc kubenswrapper[4841]: I1203 17:59:19.085325 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-64fc-account-create-update-brbfq"] Dec 03 17:59:20 crc kubenswrapper[4841]: I1203 17:59:20.263281 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a95d12fa-977b-4ac5-ba27-5c17345449e0" path="/var/lib/kubelet/pods/a95d12fa-977b-4ac5-ba27-5c17345449e0/volumes" Dec 03 17:59:20 crc kubenswrapper[4841]: I1203 17:59:20.264703 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd67b927-0545-4a8a-a3e7-ac428b92ee76" path="/var/lib/kubelet/pods/dd67b927-0545-4a8a-a3e7-ac428b92ee76/volumes" Dec 03 17:59:25 crc kubenswrapper[4841]: I1203 17:59:25.921029 4841 scope.go:117] "RemoveContainer" containerID="920957badd9afe617bdc300abbb29b9950c42bfd57dc5242c8b38fd50110c097" Dec 03 17:59:25 crc kubenswrapper[4841]: I1203 17:59:25.963371 4841 scope.go:117] "RemoveContainer" containerID="a6e04772d785ec2a2f55c1fd40906bda2825ca2dc3c75e7a90034c978bf12f9e" Dec 03 17:59:30 crc kubenswrapper[4841]: I1203 17:59:30.054358 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-29tn9"] Dec 03 17:59:30 crc kubenswrapper[4841]: I1203 17:59:30.069322 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-29tn9"] Dec 03 17:59:30 crc kubenswrapper[4841]: I1203 17:59:30.251268 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2437c8d9-3720-48c6-ad74-252479515189" path="/var/lib/kubelet/pods/2437c8d9-3720-48c6-ad74-252479515189/volumes" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.090428 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-marketplace-w7qvl"] Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.093618 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.104271 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7qvl"] Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.237993 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-catalog-content\") pod \"redhat-marketplace-w7qvl\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.238055 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64r7g\" (UniqueName: \"kubernetes.io/projected/d091e094-197b-430b-9189-56cc9ae2e40e-kube-api-access-64r7g\") pod \"redhat-marketplace-w7qvl\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.238141 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-utilities\") pod \"redhat-marketplace-w7qvl\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.339785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-catalog-content\") pod \"redhat-marketplace-w7qvl\" (UID: 
\"d091e094-197b-430b-9189-56cc9ae2e40e\") " pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.339834 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64r7g\" (UniqueName: \"kubernetes.io/projected/d091e094-197b-430b-9189-56cc9ae2e40e-kube-api-access-64r7g\") pod \"redhat-marketplace-w7qvl\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.339884 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-utilities\") pod \"redhat-marketplace-w7qvl\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.340312 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-catalog-content\") pod \"redhat-marketplace-w7qvl\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.340369 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-utilities\") pod \"redhat-marketplace-w7qvl\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.363464 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64r7g\" (UniqueName: \"kubernetes.io/projected/d091e094-197b-430b-9189-56cc9ae2e40e-kube-api-access-64r7g\") pod \"redhat-marketplace-w7qvl\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " 
pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.427744 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:46 crc kubenswrapper[4841]: I1203 17:59:46.950930 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7qvl"] Dec 03 17:59:47 crc kubenswrapper[4841]: I1203 17:59:47.065709 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7qvl" event={"ID":"d091e094-197b-430b-9189-56cc9ae2e40e","Type":"ContainerStarted","Data":"8f414b71fff952c8672f59d960e9a836eab7b7ee85c26f37329baf59487765af"} Dec 03 17:59:48 crc kubenswrapper[4841]: I1203 17:59:48.082049 4841 generic.go:334] "Generic (PLEG): container finished" podID="d091e094-197b-430b-9189-56cc9ae2e40e" containerID="d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f" exitCode=0 Dec 03 17:59:48 crc kubenswrapper[4841]: I1203 17:59:48.082151 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7qvl" event={"ID":"d091e094-197b-430b-9189-56cc9ae2e40e","Type":"ContainerDied","Data":"d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f"} Dec 03 17:59:48 crc kubenswrapper[4841]: I1203 17:59:48.085304 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 17:59:49 crc kubenswrapper[4841]: I1203 17:59:49.100462 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7qvl" event={"ID":"d091e094-197b-430b-9189-56cc9ae2e40e","Type":"ContainerStarted","Data":"104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97"} Dec 03 17:59:50 crc kubenswrapper[4841]: I1203 17:59:50.118973 4841 generic.go:334] "Generic (PLEG): container finished" podID="d091e094-197b-430b-9189-56cc9ae2e40e" 
containerID="104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97" exitCode=0 Dec 03 17:59:50 crc kubenswrapper[4841]: I1203 17:59:50.119052 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7qvl" event={"ID":"d091e094-197b-430b-9189-56cc9ae2e40e","Type":"ContainerDied","Data":"104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97"} Dec 03 17:59:51 crc kubenswrapper[4841]: I1203 17:59:51.130400 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7qvl" event={"ID":"d091e094-197b-430b-9189-56cc9ae2e40e","Type":"ContainerStarted","Data":"54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3"} Dec 03 17:59:51 crc kubenswrapper[4841]: I1203 17:59:51.153577 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w7qvl" podStartSLOduration=2.519160494 podStartE2EDuration="5.15355168s" podCreationTimestamp="2025-12-03 17:59:46 +0000 UTC" firstStartedPulling="2025-12-03 17:59:48.084934856 +0000 UTC m=+3582.472455623" lastFinishedPulling="2025-12-03 17:59:50.719326042 +0000 UTC m=+3585.106846809" observedRunningTime="2025-12-03 17:59:51.153226722 +0000 UTC m=+3585.540747459" watchObservedRunningTime="2025-12-03 17:59:51.15355168 +0000 UTC m=+3585.541072417" Dec 03 17:59:55 crc kubenswrapper[4841]: I1203 17:59:55.407844 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/manager/0.log" Dec 03 17:59:56 crc kubenswrapper[4841]: I1203 17:59:56.428197 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:56 crc kubenswrapper[4841]: I1203 17:59:56.428745 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 
03 17:59:56 crc kubenswrapper[4841]: I1203 17:59:56.503201 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:57 crc kubenswrapper[4841]: I1203 17:59:57.133598 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 17:59:57 crc kubenswrapper[4841]: I1203 17:59:57.133951 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-api" containerID="cri-o://5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db" gracePeriod=30 Dec 03 17:59:57 crc kubenswrapper[4841]: I1203 17:59:57.134040 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-listener" containerID="cri-o://a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4" gracePeriod=30 Dec 03 17:59:57 crc kubenswrapper[4841]: I1203 17:59:57.134072 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-notifier" containerID="cri-o://dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934" gracePeriod=30 Dec 03 17:59:57 crc kubenswrapper[4841]: I1203 17:59:57.134086 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-evaluator" containerID="cri-o://6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa" gracePeriod=30 Dec 03 17:59:57 crc kubenswrapper[4841]: I1203 17:59:57.245367 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:57 crc kubenswrapper[4841]: I1203 17:59:57.305568 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-w7qvl"] Dec 03 17:59:58 crc kubenswrapper[4841]: I1203 17:59:58.207167 4841 generic.go:334] "Generic (PLEG): container finished" podID="ab899add-1786-48e5-9190-25b2f51d6569" containerID="6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa" exitCode=0 Dec 03 17:59:58 crc kubenswrapper[4841]: I1203 17:59:58.207368 4841 generic.go:334] "Generic (PLEG): container finished" podID="ab899add-1786-48e5-9190-25b2f51d6569" containerID="5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db" exitCode=0 Dec 03 17:59:58 crc kubenswrapper[4841]: I1203 17:59:58.207208 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerDied","Data":"6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa"} Dec 03 17:59:58 crc kubenswrapper[4841]: I1203 17:59:58.207490 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerDied","Data":"5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db"} Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.219532 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w7qvl" podUID="d091e094-197b-430b-9189-56cc9ae2e40e" containerName="registry-server" containerID="cri-o://54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3" gracePeriod=2 Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.759108 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.871735 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-utilities\") pod \"d091e094-197b-430b-9189-56cc9ae2e40e\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.872116 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-catalog-content\") pod \"d091e094-197b-430b-9189-56cc9ae2e40e\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.872195 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64r7g\" (UniqueName: \"kubernetes.io/projected/d091e094-197b-430b-9189-56cc9ae2e40e-kube-api-access-64r7g\") pod \"d091e094-197b-430b-9189-56cc9ae2e40e\" (UID: \"d091e094-197b-430b-9189-56cc9ae2e40e\") " Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.873026 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-utilities" (OuterVolumeSpecName: "utilities") pod "d091e094-197b-430b-9189-56cc9ae2e40e" (UID: "d091e094-197b-430b-9189-56cc9ae2e40e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.879149 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d091e094-197b-430b-9189-56cc9ae2e40e-kube-api-access-64r7g" (OuterVolumeSpecName: "kube-api-access-64r7g") pod "d091e094-197b-430b-9189-56cc9ae2e40e" (UID: "d091e094-197b-430b-9189-56cc9ae2e40e"). InnerVolumeSpecName "kube-api-access-64r7g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.889508 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d091e094-197b-430b-9189-56cc9ae2e40e" (UID: "d091e094-197b-430b-9189-56cc9ae2e40e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.974587 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.974618 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64r7g\" (UniqueName: \"kubernetes.io/projected/d091e094-197b-430b-9189-56cc9ae2e40e-kube-api-access-64r7g\") on node \"crc\" DevicePath \"\"" Dec 03 17:59:59 crc kubenswrapper[4841]: I1203 17:59:59.974631 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d091e094-197b-430b-9189-56cc9ae2e40e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.165012 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp"] Dec 03 18:00:00 crc kubenswrapper[4841]: E1203 18:00:00.165445 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d091e094-197b-430b-9189-56cc9ae2e40e" containerName="registry-server" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.165466 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d091e094-197b-430b-9189-56cc9ae2e40e" containerName="registry-server" Dec 03 18:00:00 crc kubenswrapper[4841]: E1203 18:00:00.165491 4841 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d091e094-197b-430b-9189-56cc9ae2e40e" containerName="extract-content" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.165497 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d091e094-197b-430b-9189-56cc9ae2e40e" containerName="extract-content" Dec 03 18:00:00 crc kubenswrapper[4841]: E1203 18:00:00.165528 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d091e094-197b-430b-9189-56cc9ae2e40e" containerName="extract-utilities" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.165535 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="d091e094-197b-430b-9189-56cc9ae2e40e" containerName="extract-utilities" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.165730 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="d091e094-197b-430b-9189-56cc9ae2e40e" containerName="registry-server" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.166547 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.169858 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.172098 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.197184 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp"] Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.239196 4841 generic.go:334] "Generic (PLEG): container finished" podID="ab899add-1786-48e5-9190-25b2f51d6569" containerID="dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934" exitCode=0 Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.239299 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerDied","Data":"dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934"} Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.248890 4841 generic.go:334] "Generic (PLEG): container finished" podID="d091e094-197b-430b-9189-56cc9ae2e40e" containerID="54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3" exitCode=0 Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.249123 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w7qvl" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.263398 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7qvl" event={"ID":"d091e094-197b-430b-9189-56cc9ae2e40e","Type":"ContainerDied","Data":"54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3"} Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.263458 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w7qvl" event={"ID":"d091e094-197b-430b-9189-56cc9ae2e40e","Type":"ContainerDied","Data":"8f414b71fff952c8672f59d960e9a836eab7b7ee85c26f37329baf59487765af"} Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.263494 4841 scope.go:117] "RemoveContainer" containerID="54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.281539 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sqtv\" (UniqueName: \"kubernetes.io/projected/45fca2b0-0152-4799-a00f-f314a5b959a5-kube-api-access-5sqtv\") pod \"collect-profiles-29413080-822xp\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.281610 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45fca2b0-0152-4799-a00f-f314a5b959a5-config-volume\") pod \"collect-profiles-29413080-822xp\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.281686 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/45fca2b0-0152-4799-a00f-f314a5b959a5-secret-volume\") pod \"collect-profiles-29413080-822xp\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.298200 4841 scope.go:117] "RemoveContainer" containerID="104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.298440 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7qvl"] Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.310691 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w7qvl"] Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.321152 4841 scope.go:117] "RemoveContainer" containerID="d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.351013 4841 scope.go:117] "RemoveContainer" containerID="54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3" Dec 03 18:00:00 crc kubenswrapper[4841]: E1203 18:00:00.351501 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3\": container with ID starting with 54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3 not found: ID does not exist" containerID="54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.351546 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3"} err="failed to get container status \"54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3\": rpc error: code = NotFound desc = could not find container 
\"54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3\": container with ID starting with 54a3c98673bab52d34d2b3c7fea6d2b1c1007b40d87bea505c4769b8bc6f98f3 not found: ID does not exist" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.351575 4841 scope.go:117] "RemoveContainer" containerID="104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97" Dec 03 18:00:00 crc kubenswrapper[4841]: E1203 18:00:00.351946 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97\": container with ID starting with 104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97 not found: ID does not exist" containerID="104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.351978 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97"} err="failed to get container status \"104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97\": rpc error: code = NotFound desc = could not find container \"104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97\": container with ID starting with 104501876fd2c1a36854d241fd99a2e9cae06b6fe6cb7c9d890b870900496c97 not found: ID does not exist" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.352001 4841 scope.go:117] "RemoveContainer" containerID="d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f" Dec 03 18:00:00 crc kubenswrapper[4841]: E1203 18:00:00.352401 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f\": container with ID starting with d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f not found: ID does not exist" 
containerID="d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.352420 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f"} err="failed to get container status \"d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f\": rpc error: code = NotFound desc = could not find container \"d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f\": container with ID starting with d562469945a4b9720a1b123875d595ebbf939802aaffe8566853d65e42ce838f not found: ID does not exist" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.383801 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sqtv\" (UniqueName: \"kubernetes.io/projected/45fca2b0-0152-4799-a00f-f314a5b959a5-kube-api-access-5sqtv\") pod \"collect-profiles-29413080-822xp\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.383967 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45fca2b0-0152-4799-a00f-f314a5b959a5-config-volume\") pod \"collect-profiles-29413080-822xp\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.384107 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45fca2b0-0152-4799-a00f-f314a5b959a5-secret-volume\") pod \"collect-profiles-29413080-822xp\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 
18:00:00.385724 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45fca2b0-0152-4799-a00f-f314a5b959a5-config-volume\") pod \"collect-profiles-29413080-822xp\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.389592 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45fca2b0-0152-4799-a00f-f314a5b959a5-secret-volume\") pod \"collect-profiles-29413080-822xp\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.421604 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sqtv\" (UniqueName: \"kubernetes.io/projected/45fca2b0-0152-4799-a00f-f314a5b959a5-kube-api-access-5sqtv\") pod \"collect-profiles-29413080-822xp\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.486361 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.864054 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp"] Dec 03 18:00:00 crc kubenswrapper[4841]: I1203 18:00:00.969755 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.105575 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-internal-tls-certs\") pod \"ab899add-1786-48e5-9190-25b2f51d6569\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.106002 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-config-data\") pod \"ab899add-1786-48e5-9190-25b2f51d6569\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.106672 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-combined-ca-bundle\") pod \"ab899add-1786-48e5-9190-25b2f51d6569\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.106931 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-scripts\") pod \"ab899add-1786-48e5-9190-25b2f51d6569\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.107139 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66c7p\" (UniqueName: \"kubernetes.io/projected/ab899add-1786-48e5-9190-25b2f51d6569-kube-api-access-66c7p\") pod \"ab899add-1786-48e5-9190-25b2f51d6569\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.107369 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-public-tls-certs\") pod \"ab899add-1786-48e5-9190-25b2f51d6569\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.112973 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-scripts" (OuterVolumeSpecName: "scripts") pod "ab899add-1786-48e5-9190-25b2f51d6569" (UID: "ab899add-1786-48e5-9190-25b2f51d6569"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.113436 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab899add-1786-48e5-9190-25b2f51d6569-kube-api-access-66c7p" (OuterVolumeSpecName: "kube-api-access-66c7p") pod "ab899add-1786-48e5-9190-25b2f51d6569" (UID: "ab899add-1786-48e5-9190-25b2f51d6569"). InnerVolumeSpecName "kube-api-access-66c7p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.161698 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ab899add-1786-48e5-9190-25b2f51d6569" (UID: "ab899add-1786-48e5-9190-25b2f51d6569"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.167643 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ab899add-1786-48e5-9190-25b2f51d6569" (UID: "ab899add-1786-48e5-9190-25b2f51d6569"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:00:01 crc kubenswrapper[4841]: E1203 18:00:01.205344 4841 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-config-data podName:ab899add-1786-48e5-9190-25b2f51d6569 nodeName:}" failed. No retries permitted until 2025-12-03 18:00:01.705309566 +0000 UTC m=+3596.092830293 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-config-data") pod "ab899add-1786-48e5-9190-25b2f51d6569" (UID: "ab899add-1786-48e5-9190-25b2f51d6569") : error deleting /var/lib/kubelet/pods/ab899add-1786-48e5-9190-25b2f51d6569/volume-subpaths: remove /var/lib/kubelet/pods/ab899add-1786-48e5-9190-25b2f51d6569/volume-subpaths: no such file or directory Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.208473 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab899add-1786-48e5-9190-25b2f51d6569" (UID: "ab899add-1786-48e5-9190-25b2f51d6569"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.209887 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-combined-ca-bundle\") pod \"ab899add-1786-48e5-9190-25b2f51d6569\" (UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " Dec 03 18:00:01 crc kubenswrapper[4841]: W1203 18:00:01.210139 4841 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/ab899add-1786-48e5-9190-25b2f51d6569/volumes/kubernetes.io~secret/combined-ca-bundle Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.210228 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab899add-1786-48e5-9190-25b2f51d6569" (UID: "ab899add-1786-48e5-9190-25b2f51d6569"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.210516 4841 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.210557 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66c7p\" (UniqueName: \"kubernetes.io/projected/ab899add-1786-48e5-9190-25b2f51d6569-kube-api-access-66c7p\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.210569 4841 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.210577 4841 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.210585 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.264345 4841 generic.go:334] "Generic (PLEG): container finished" podID="ab899add-1786-48e5-9190-25b2f51d6569" containerID="a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4" exitCode=0 Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.264414 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.264429 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerDied","Data":"a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4"} Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.264456 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"ab899add-1786-48e5-9190-25b2f51d6569","Type":"ContainerDied","Data":"b7fbbd740d8e07d018391d1ac9d11345274df53b479e0456187df3a3d875a559"} Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.264472 4841 scope.go:117] "RemoveContainer" containerID="a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.266220 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" event={"ID":"45fca2b0-0152-4799-a00f-f314a5b959a5","Type":"ContainerStarted","Data":"c8de292474f38ed865451eb4002aec9ce4e74b69e6dabf3a5dac8ef15f2d688f"} Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.266246 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" event={"ID":"45fca2b0-0152-4799-a00f-f314a5b959a5","Type":"ContainerStarted","Data":"f9f3e5ef0918e0b34d91753e30b0d9f662056b6de2cc44cc4d237f4b936498b6"} Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.284184 4841 scope.go:117] "RemoveContainer" containerID="dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.291722 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" podStartSLOduration=1.291706668 podStartE2EDuration="1.291706668s" podCreationTimestamp="2025-12-03 18:00:00 
+0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:00:01.289301449 +0000 UTC m=+3595.676822176" watchObservedRunningTime="2025-12-03 18:00:01.291706668 +0000 UTC m=+3595.679227395" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.314675 4841 scope.go:117] "RemoveContainer" containerID="6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.349472 4841 scope.go:117] "RemoveContainer" containerID="5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.371995 4841 scope.go:117] "RemoveContainer" containerID="a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4" Dec 03 18:00:01 crc kubenswrapper[4841]: E1203 18:00:01.372776 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4\": container with ID starting with a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4 not found: ID does not exist" containerID="a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.372877 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4"} err="failed to get container status \"a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4\": rpc error: code = NotFound desc = could not find container \"a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4\": container with ID starting with a4e3711a5f765b4ea0375649065e5ddf696f084745d5a5fa5ac5acc7073924f4 not found: ID does not exist" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.372938 4841 scope.go:117] "RemoveContainer" 
containerID="dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934" Dec 03 18:00:01 crc kubenswrapper[4841]: E1203 18:00:01.373454 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934\": container with ID starting with dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934 not found: ID does not exist" containerID="dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.373488 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934"} err="failed to get container status \"dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934\": rpc error: code = NotFound desc = could not find container \"dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934\": container with ID starting with dbb79a4b1b45f5c33ad288f912dd9a0814b79b702ba50a62ef464620fc8db934 not found: ID does not exist" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.373513 4841 scope.go:117] "RemoveContainer" containerID="6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa" Dec 03 18:00:01 crc kubenswrapper[4841]: E1203 18:00:01.374019 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa\": container with ID starting with 6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa not found: ID does not exist" containerID="6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.374065 4841 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa"} err="failed to get container status \"6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa\": rpc error: code = NotFound desc = could not find container \"6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa\": container with ID starting with 6972f9185dc7da81a6f1a6ada241c1d3307e74808f9ef4812cc83618e4dc56aa not found: ID does not exist" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.374094 4841 scope.go:117] "RemoveContainer" containerID="5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db" Dec 03 18:00:01 crc kubenswrapper[4841]: E1203 18:00:01.374435 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db\": container with ID starting with 5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db not found: ID does not exist" containerID="5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.374452 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db"} err="failed to get container status \"5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db\": rpc error: code = NotFound desc = could not find container \"5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db\": container with ID starting with 5a9b30aaaf6d31f64c5d87646f5b5009d8bf729ebb9190634e1c8c6e6fa6d1db not found: ID does not exist" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.743423 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-config-data\") pod \"ab899add-1786-48e5-9190-25b2f51d6569\" 
(UID: \"ab899add-1786-48e5-9190-25b2f51d6569\") " Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.750280 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-config-data" (OuterVolumeSpecName: "config-data") pod "ab899add-1786-48e5-9190-25b2f51d6569" (UID: "ab899add-1786-48e5-9190-25b2f51d6569"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.847089 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ab899add-1786-48e5-9190-25b2f51d6569-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.912607 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.960008 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.993086 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 18:00:01 crc kubenswrapper[4841]: E1203 18:00:01.993881 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-api" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.994093 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-api" Dec 03 18:00:01 crc kubenswrapper[4841]: E1203 18:00:01.994120 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-notifier" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.994132 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-notifier" Dec 03 18:00:01 crc kubenswrapper[4841]: E1203 18:00:01.994161 
4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-listener" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.994170 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-listener" Dec 03 18:00:01 crc kubenswrapper[4841]: E1203 18:00:01.994198 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-evaluator" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.994206 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-evaluator" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.994621 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-listener" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.994642 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-notifier" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.994657 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-evaluator" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.994672 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab899add-1786-48e5-9190-25b2f51d6569" containerName="aodh-api" Dec 03 18:00:01 crc kubenswrapper[4841]: I1203 18:00:01.997991 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.007933 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.008185 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.008294 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-9f96b" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.008317 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.008470 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.010101 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.155876 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.156629 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-public-tls-certs\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.156950 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-psgrw\" (UniqueName: 
\"kubernetes.io/projected/6ece35e7-8cb6-4585-be80-cb4a526af861-kube-api-access-psgrw\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.157071 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-scripts\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.157170 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-internal-tls-certs\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.157217 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-config-data\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.255308 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab899add-1786-48e5-9190-25b2f51d6569" path="/var/lib/kubelet/pods/ab899add-1786-48e5-9190-25b2f51d6569/volumes" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.256360 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d091e094-197b-430b-9189-56cc9ae2e40e" path="/var/lib/kubelet/pods/d091e094-197b-430b-9189-56cc9ae2e40e/volumes" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.258794 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-internal-tls-certs\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.258932 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-config-data\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.259052 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.259123 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-public-tls-certs\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.259240 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-psgrw\" (UniqueName: \"kubernetes.io/projected/6ece35e7-8cb6-4585-be80-cb4a526af861-kube-api-access-psgrw\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.259322 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-scripts\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.272559 4841 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-public-tls-certs\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.272755 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-scripts\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.273103 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.273217 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-config-data\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.280028 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ece35e7-8cb6-4585-be80-cb4a526af861-internal-tls-certs\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.287670 4841 generic.go:334] "Generic (PLEG): container finished" podID="45fca2b0-0152-4799-a00f-f314a5b959a5" containerID="c8de292474f38ed865451eb4002aec9ce4e74b69e6dabf3a5dac8ef15f2d688f" exitCode=0 Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.288162 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" event={"ID":"45fca2b0-0152-4799-a00f-f314a5b959a5","Type":"ContainerDied","Data":"c8de292474f38ed865451eb4002aec9ce4e74b69e6dabf3a5dac8ef15f2d688f"} Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.289656 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-psgrw\" (UniqueName: \"kubernetes.io/projected/6ece35e7-8cb6-4585-be80-cb4a526af861-kube-api-access-psgrw\") pod \"aodh-0\" (UID: \"6ece35e7-8cb6-4585-be80-cb4a526af861\") " pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.318445 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 18:00:02 crc kubenswrapper[4841]: W1203 18:00:02.807825 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ece35e7_8cb6_4585_be80_cb4a526af861.slice/crio-0a7947251c84e64651e496dcea6dbac7527bdee269d6712531acd2a3ae1a2d1c WatchSource:0}: Error finding container 0a7947251c84e64651e496dcea6dbac7527bdee269d6712531acd2a3ae1a2d1c: Status 404 returned error can't find the container with id 0a7947251c84e64651e496dcea6dbac7527bdee269d6712531acd2a3ae1a2d1c Dec 03 18:00:02 crc kubenswrapper[4841]: I1203 18:00:02.807964 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.305712 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6ece35e7-8cb6-4585-be80-cb4a526af861","Type":"ContainerStarted","Data":"0a7947251c84e64651e496dcea6dbac7527bdee269d6712531acd2a3ae1a2d1c"} Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.743310 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.769081 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sqtv\" (UniqueName: \"kubernetes.io/projected/45fca2b0-0152-4799-a00f-f314a5b959a5-kube-api-access-5sqtv\") pod \"45fca2b0-0152-4799-a00f-f314a5b959a5\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.769568 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45fca2b0-0152-4799-a00f-f314a5b959a5-config-volume\") pod \"45fca2b0-0152-4799-a00f-f314a5b959a5\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.769988 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45fca2b0-0152-4799-a00f-f314a5b959a5-secret-volume\") pod \"45fca2b0-0152-4799-a00f-f314a5b959a5\" (UID: \"45fca2b0-0152-4799-a00f-f314a5b959a5\") " Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.771053 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45fca2b0-0152-4799-a00f-f314a5b959a5-config-volume" (OuterVolumeSpecName: "config-volume") pod "45fca2b0-0152-4799-a00f-f314a5b959a5" (UID: "45fca2b0-0152-4799-a00f-f314a5b959a5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.774097 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45fca2b0-0152-4799-a00f-f314a5b959a5-kube-api-access-5sqtv" (OuterVolumeSpecName: "kube-api-access-5sqtv") pod "45fca2b0-0152-4799-a00f-f314a5b959a5" (UID: "45fca2b0-0152-4799-a00f-f314a5b959a5"). 
InnerVolumeSpecName "kube-api-access-5sqtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.777176 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45fca2b0-0152-4799-a00f-f314a5b959a5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "45fca2b0-0152-4799-a00f-f314a5b959a5" (UID: "45fca2b0-0152-4799-a00f-f314a5b959a5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.873058 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5sqtv\" (UniqueName: \"kubernetes.io/projected/45fca2b0-0152-4799-a00f-f314a5b959a5-kube-api-access-5sqtv\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.873106 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/45fca2b0-0152-4799-a00f-f314a5b959a5-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:03 crc kubenswrapper[4841]: I1203 18:00:03.873126 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/45fca2b0-0152-4799-a00f-f314a5b959a5-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 18:00:04 crc kubenswrapper[4841]: I1203 18:00:04.389427 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" Dec 03 18:00:04 crc kubenswrapper[4841]: I1203 18:00:04.391250 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413080-822xp" event={"ID":"45fca2b0-0152-4799-a00f-f314a5b959a5","Type":"ContainerDied","Data":"f9f3e5ef0918e0b34d91753e30b0d9f662056b6de2cc44cc4d237f4b936498b6"} Dec 03 18:00:04 crc kubenswrapper[4841]: I1203 18:00:04.391306 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9f3e5ef0918e0b34d91753e30b0d9f662056b6de2cc44cc4d237f4b936498b6" Dec 03 18:00:04 crc kubenswrapper[4841]: I1203 18:00:04.393384 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6ece35e7-8cb6-4585-be80-cb4a526af861","Type":"ContainerStarted","Data":"019c83940e1a0529c83a5381bf44f7efbf8bf629011ee26e5d4a0973f1801edc"} Dec 03 18:00:04 crc kubenswrapper[4841]: I1203 18:00:04.434697 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt"] Dec 03 18:00:04 crc kubenswrapper[4841]: I1203 18:00:04.442735 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413035-w5flt"] Dec 03 18:00:05 crc kubenswrapper[4841]: I1203 18:00:05.408370 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6ece35e7-8cb6-4585-be80-cb4a526af861","Type":"ContainerStarted","Data":"0c8fbf9dc88f841d0c5d19c3c5867b505168e7c05a3f490873347cd493164dd9"} Dec 03 18:00:06 crc kubenswrapper[4841]: I1203 18:00:06.263187 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="480e0b95-5535-4d4b-9261-8b0e5ea621ef" path="/var/lib/kubelet/pods/480e0b95-5535-4d4b-9261-8b0e5ea621ef/volumes" Dec 03 18:00:06 crc kubenswrapper[4841]: I1203 18:00:06.430503 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/aodh-0" event={"ID":"6ece35e7-8cb6-4585-be80-cb4a526af861","Type":"ContainerStarted","Data":"424050e82b41d9fd4fcb0ecec4f27fe222a4c3cda0f13d7512765a552a15a483"} Dec 03 18:00:06 crc kubenswrapper[4841]: I1203 18:00:06.430558 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6ece35e7-8cb6-4585-be80-cb4a526af861","Type":"ContainerStarted","Data":"fd299084fae72be582b29a02bf3510008d07f57985b8f2d3060bb4df9aa2904c"} Dec 03 18:00:06 crc kubenswrapper[4841]: I1203 18:00:06.457626 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.299619562 podStartE2EDuration="5.457610292s" podCreationTimestamp="2025-12-03 18:00:01 +0000 UTC" firstStartedPulling="2025-12-03 18:00:02.810486458 +0000 UTC m=+3597.198007185" lastFinishedPulling="2025-12-03 18:00:05.968477148 +0000 UTC m=+3600.355997915" observedRunningTime="2025-12-03 18:00:06.450415394 +0000 UTC m=+3600.837936121" watchObservedRunningTime="2025-12-03 18:00:06.457610292 +0000 UTC m=+3600.845131019" Dec 03 18:00:26 crc kubenswrapper[4841]: I1203 18:00:26.108262 4841 scope.go:117] "RemoveContainer" containerID="19201c2b3a5aa5b4fec97f0b84eb404e483bfc55d6dbc47b93a926c59973a213" Dec 03 18:00:26 crc kubenswrapper[4841]: I1203 18:00:26.160704 4841 scope.go:117] "RemoveContainer" containerID="3f24b38f44e89dfddf7a8a2544c0c693096c9ebb32ea34ba65f29e96d204b920" Dec 03 18:00:39 crc kubenswrapper[4841]: I1203 18:00:39.317126 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:00:39 crc kubenswrapper[4841]: I1203 18:00:39.317751 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" 
podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.192651 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29413081-tmxhg"] Dec 03 18:01:00 crc kubenswrapper[4841]: E1203 18:01:00.194726 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45fca2b0-0152-4799-a00f-f314a5b959a5" containerName="collect-profiles" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.194765 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="45fca2b0-0152-4799-a00f-f314a5b959a5" containerName="collect-profiles" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.195487 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="45fca2b0-0152-4799-a00f-f314a5b959a5" containerName="collect-profiles" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.197385 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.205298 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29413081-tmxhg"] Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.374005 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dzds\" (UniqueName: \"kubernetes.io/projected/a19d874c-b175-4268-87ae-bec2516b1e1a-kube-api-access-7dzds\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.374497 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-config-data\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.374586 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-fernet-keys\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.374626 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-combined-ca-bundle\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.476755 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-config-data\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.476852 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-fernet-keys\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.476919 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-combined-ca-bundle\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.477291 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dzds\" (UniqueName: \"kubernetes.io/projected/a19d874c-b175-4268-87ae-bec2516b1e1a-kube-api-access-7dzds\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.488186 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-combined-ca-bundle\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.488360 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-fernet-keys\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.495525 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-config-data\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.498572 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dzds\" (UniqueName: \"kubernetes.io/projected/a19d874c-b175-4268-87ae-bec2516b1e1a-kube-api-access-7dzds\") pod \"keystone-cron-29413081-tmxhg\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:00 crc kubenswrapper[4841]: I1203 18:01:00.534011 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:01 crc kubenswrapper[4841]: I1203 18:01:01.071691 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29413081-tmxhg"] Dec 03 18:01:01 crc kubenswrapper[4841]: I1203 18:01:01.112572 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413081-tmxhg" event={"ID":"a19d874c-b175-4268-87ae-bec2516b1e1a","Type":"ContainerStarted","Data":"d45956fbbbaa7bad93b652bb2083a5c145969863c1cc8486a30ede6904f85ebb"} Dec 03 18:01:02 crc kubenswrapper[4841]: I1203 18:01:02.131493 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413081-tmxhg" event={"ID":"a19d874c-b175-4268-87ae-bec2516b1e1a","Type":"ContainerStarted","Data":"27673071be9cb412729be5925ed9398d725a2b8b564f7b519c24bfad196e3e72"} Dec 03 18:01:02 crc kubenswrapper[4841]: I1203 18:01:02.162670 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29413081-tmxhg" podStartSLOduration=2.162627135 podStartE2EDuration="2.162627135s" podCreationTimestamp="2025-12-03 18:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:01:02.1559352 +0000 UTC m=+3656.543455957" watchObservedRunningTime="2025-12-03 18:01:02.162627135 +0000 UTC m=+3656.550147872" Dec 03 18:01:04 crc kubenswrapper[4841]: I1203 18:01:04.160218 4841 generic.go:334] "Generic (PLEG): container finished" podID="a19d874c-b175-4268-87ae-bec2516b1e1a" containerID="27673071be9cb412729be5925ed9398d725a2b8b564f7b519c24bfad196e3e72" exitCode=0 Dec 03 18:01:04 crc kubenswrapper[4841]: I1203 18:01:04.160306 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413081-tmxhg" event={"ID":"a19d874c-b175-4268-87ae-bec2516b1e1a","Type":"ContainerDied","Data":"27673071be9cb412729be5925ed9398d725a2b8b564f7b519c24bfad196e3e72"} 
Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.594213 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.792378 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-combined-ca-bundle\") pod \"a19d874c-b175-4268-87ae-bec2516b1e1a\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.792500 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7dzds\" (UniqueName: \"kubernetes.io/projected/a19d874c-b175-4268-87ae-bec2516b1e1a-kube-api-access-7dzds\") pod \"a19d874c-b175-4268-87ae-bec2516b1e1a\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.792718 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-config-data\") pod \"a19d874c-b175-4268-87ae-bec2516b1e1a\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.792768 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-fernet-keys\") pod \"a19d874c-b175-4268-87ae-bec2516b1e1a\" (UID: \"a19d874c-b175-4268-87ae-bec2516b1e1a\") " Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.800492 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a19d874c-b175-4268-87ae-bec2516b1e1a-kube-api-access-7dzds" (OuterVolumeSpecName: "kube-api-access-7dzds") pod "a19d874c-b175-4268-87ae-bec2516b1e1a" (UID: "a19d874c-b175-4268-87ae-bec2516b1e1a"). 
InnerVolumeSpecName "kube-api-access-7dzds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.800856 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a19d874c-b175-4268-87ae-bec2516b1e1a" (UID: "a19d874c-b175-4268-87ae-bec2516b1e1a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.852220 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a19d874c-b175-4268-87ae-bec2516b1e1a" (UID: "a19d874c-b175-4268-87ae-bec2516b1e1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.876591 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-config-data" (OuterVolumeSpecName: "config-data") pod "a19d874c-b175-4268-87ae-bec2516b1e1a" (UID: "a19d874c-b175-4268-87ae-bec2516b1e1a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.895312 4841 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.895356 4841 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.895375 4841 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a19d874c-b175-4268-87ae-bec2516b1e1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:01:05 crc kubenswrapper[4841]: I1203 18:01:05.895391 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7dzds\" (UniqueName: \"kubernetes.io/projected/a19d874c-b175-4268-87ae-bec2516b1e1a-kube-api-access-7dzds\") on node \"crc\" DevicePath \"\"" Dec 03 18:01:06 crc kubenswrapper[4841]: I1203 18:01:06.186038 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29413081-tmxhg" event={"ID":"a19d874c-b175-4268-87ae-bec2516b1e1a","Type":"ContainerDied","Data":"d45956fbbbaa7bad93b652bb2083a5c145969863c1cc8486a30ede6904f85ebb"} Dec 03 18:01:06 crc kubenswrapper[4841]: I1203 18:01:06.186418 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d45956fbbbaa7bad93b652bb2083a5c145969863c1cc8486a30ede6904f85ebb" Dec 03 18:01:06 crc kubenswrapper[4841]: I1203 18:01:06.186094 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29413081-tmxhg" Dec 03 18:01:09 crc kubenswrapper[4841]: I1203 18:01:09.316651 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:01:09 crc kubenswrapper[4841]: I1203 18:01:09.317039 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:01:26 crc kubenswrapper[4841]: I1203 18:01:26.331319 4841 scope.go:117] "RemoveContainer" containerID="5c27f1ebca17a3f9be1ab1ea3f0707941dde25a81ec866c7816a41f8f3ecd479" Dec 03 18:01:39 crc kubenswrapper[4841]: I1203 18:01:39.316626 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:01:39 crc kubenswrapper[4841]: I1203 18:01:39.317269 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:01:39 crc kubenswrapper[4841]: I1203 18:01:39.317332 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 18:01:39 crc kubenswrapper[4841]: I1203 
18:01:39.318314 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 18:01:39 crc kubenswrapper[4841]: I1203 18:01:39.318427 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" gracePeriod=600 Dec 03 18:01:39 crc kubenswrapper[4841]: E1203 18:01:39.484234 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:01:39 crc kubenswrapper[4841]: I1203 18:01:39.599497 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" exitCode=0 Dec 03 18:01:39 crc kubenswrapper[4841]: I1203 18:01:39.599560 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195"} Dec 03 18:01:39 crc kubenswrapper[4841]: I1203 18:01:39.599647 4841 scope.go:117] "RemoveContainer" 
containerID="44d7a5c743f6af1d637bc99274fd6c329d3411c0bab643733a894f3ea3a95e72" Dec 03 18:01:39 crc kubenswrapper[4841]: I1203 18:01:39.600668 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:01:39 crc kubenswrapper[4841]: E1203 18:01:39.601191 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:01:50 crc kubenswrapper[4841]: I1203 18:01:50.239197 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:01:50 crc kubenswrapper[4841]: E1203 18:01:50.240512 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:01:57 crc kubenswrapper[4841]: I1203 18:01:57.390988 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/manager/0.log" Dec 03 18:01:57 crc kubenswrapper[4841]: E1203 18:01:57.916891 4841 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:37688->38.102.83.151:44085: write tcp 38.102.83.151:37688->38.102.83.151:44085: write: broken pipe Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.239321 
4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:02:01 crc kubenswrapper[4841]: E1203 18:02:01.240009 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.419547 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.419866 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="prometheus" containerID="cri-o://cbb97066852a9f363dfdf3dcd3e2dc7ec1aaa30a4eca966e1b8d0380f30c2d27" gracePeriod=600 Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.420189 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="thanos-sidecar" containerID="cri-o://64891335f24364fa91a90849d22a4b856632847545ff51648145a621cdfa9438" gracePeriod=600 Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.420404 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="config-reloader" containerID="cri-o://401ac88482de116750c86b28264b2a01507c752d7b7051078f6100509637c850" gracePeriod=600 Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.904606 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerID="64891335f24364fa91a90849d22a4b856632847545ff51648145a621cdfa9438" exitCode=0 Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.904947 4841 generic.go:334] "Generic (PLEG): container finished" podID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerID="401ac88482de116750c86b28264b2a01507c752d7b7051078f6100509637c850" exitCode=0 Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.904960 4841 generic.go:334] "Generic (PLEG): container finished" podID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerID="cbb97066852a9f363dfdf3dcd3e2dc7ec1aaa30a4eca966e1b8d0380f30c2d27" exitCode=0 Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.904654 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerDied","Data":"64891335f24364fa91a90849d22a4b856632847545ff51648145a621cdfa9438"} Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.904997 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerDied","Data":"401ac88482de116750c86b28264b2a01507c752d7b7051078f6100509637c850"} Dec 03 18:02:01 crc kubenswrapper[4841]: I1203 18:02:01.905012 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerDied","Data":"cbb97066852a9f363dfdf3dcd3e2dc7ec1aaa30a4eca966e1b8d0380f30c2d27"} Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.589418 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.639570 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.639624 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-thanos-prometheus-http-client-file\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.639700 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-config-out\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.639722 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-tls-assets\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.639762 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-secret-combined-ca-bundle\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc 
kubenswrapper[4841]: I1203 18:02:02.639803 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.639820 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-config\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.639851 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.639918 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xd5bt\" (UniqueName: \"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-kube-api-access-xd5bt\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.639935 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-rulefiles-0\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.640027 4841 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-db\") pod \"67b6908a-d387-4b04-8151-0e3f4c83901b\" (UID: \"67b6908a-d387-4b04-8151-0e3f4c83901b\") " Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.641006 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-db" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "prometheus-metric-storage-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.650120 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.650545 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.666319 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-config" (OuterVolumeSpecName: "config") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.666709 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.676167 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-kube-api-access-xd5bt" (OuterVolumeSpecName: "kube-api-access-xd5bt") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "kube-api-access-xd5bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.693118 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.698644 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "secret-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.699354 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-config-out" (OuterVolumeSpecName: "config-out") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.699467 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743070 4841 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743100 4841 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-config-out\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743109 4841 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-tls-assets\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743120 4841 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743130 4841 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743139 4841 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743150 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xd5bt\" (UniqueName: 
\"kubernetes.io/projected/67b6908a-d387-4b04-8151-0e3f4c83901b-kube-api-access-xd5bt\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743159 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743169 4841 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/67b6908a-d387-4b04-8151-0e3f4c83901b-prometheus-metric-storage-db\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.743181 4841 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.790227 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config" (OuterVolumeSpecName: "web-config") pod "67b6908a-d387-4b04-8151-0e3f4c83901b" (UID: "67b6908a-d387-4b04-8151-0e3f4c83901b"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.846115 4841 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/67b6908a-d387-4b04-8151-0e3f4c83901b-web-config\") on node \"crc\" DevicePath \"\"" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.917180 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"67b6908a-d387-4b04-8151-0e3f4c83901b","Type":"ContainerDied","Data":"ff9a9b616e32337b7a497066f5d77dd6de4505771158126f3e57905be15c1d9a"} Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.917235 4841 scope.go:117] "RemoveContainer" containerID="64891335f24364fa91a90849d22a4b856632847545ff51648145a621cdfa9438" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.917277 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.963148 4841 scope.go:117] "RemoveContainer" containerID="401ac88482de116750c86b28264b2a01507c752d7b7051078f6100509637c850" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.963274 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.971431 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.988388 4841 scope.go:117] "RemoveContainer" containerID="cbb97066852a9f363dfdf3dcd3e2dc7ec1aaa30a4eca966e1b8d0380f30c2d27" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.998837 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 18:02:02 crc kubenswrapper[4841]: E1203 18:02:02.999342 4841 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="prometheus" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.999367 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="prometheus" Dec 03 18:02:02 crc kubenswrapper[4841]: E1203 18:02:02.999411 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="config-reloader" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.999421 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="config-reloader" Dec 03 18:02:02 crc kubenswrapper[4841]: E1203 18:02:02.999434 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a19d874c-b175-4268-87ae-bec2516b1e1a" containerName="keystone-cron" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.999442 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a19d874c-b175-4268-87ae-bec2516b1e1a" containerName="keystone-cron" Dec 03 18:02:02 crc kubenswrapper[4841]: E1203 18:02:02.999451 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="thanos-sidecar" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.999458 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="thanos-sidecar" Dec 03 18:02:02 crc kubenswrapper[4841]: E1203 18:02:02.999482 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="init-config-reloader" Dec 03 18:02:02 crc kubenswrapper[4841]: I1203 18:02:02.999491 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="init-config-reloader" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:02.999737 4841 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="thanos-sidecar" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:02.999755 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a19d874c-b175-4268-87ae-bec2516b1e1a" containerName="keystone-cron" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:02.999767 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="prometheus" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:02.999786 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" containerName="config-reloader" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.001929 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.004326 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.004720 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.004946 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.004985 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.005050 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-4mqjg" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.012031 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc" Dec 03 18:02:03 crc 
kubenswrapper[4841]: I1203 18:02:03.015448 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.027917 4841 scope.go:117] "RemoveContainer" containerID="1c5f75446cdbe4cd713f74fe99c03603156f3be26d3c09ebcffe149c7d245eb2" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.035129 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.049607 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-config\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.049661 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.049712 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.049739 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.049766 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mhq\" (UniqueName: \"kubernetes.io/projected/d9f984e5-9578-43d5-be4a-4ea8f5634547-kube-api-access-59mhq\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.049824 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.049944 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.049972 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d9f984e5-9578-43d5-be4a-4ea8f5634547-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " 
pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.050007 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d9f984e5-9578-43d5-be4a-4ea8f5634547-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.050081 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d9f984e5-9578-43d5-be4a-4ea8f5634547-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.050122 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d9f984e5-9578-43d5-be4a-4ea8f5634547-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.151381 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.151426 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: 
\"kubernetes.io/empty-dir/d9f984e5-9578-43d5-be4a-4ea8f5634547-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.151453 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d9f984e5-9578-43d5-be4a-4ea8f5634547-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.151512 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d9f984e5-9578-43d5-be4a-4ea8f5634547-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.151540 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d9f984e5-9578-43d5-be4a-4ea8f5634547-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.151572 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-config\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.151593 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" 
(UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.151621 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.151638 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.152154 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59mhq\" (UniqueName: \"kubernetes.io/projected/d9f984e5-9578-43d5-be4a-4ea8f5634547-kube-api-access-59mhq\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.152197 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.152441 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/d9f984e5-9578-43d5-be4a-4ea8f5634547-prometheus-metric-storage-db\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.152712 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d9f984e5-9578-43d5-be4a-4ea8f5634547-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.156631 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.157241 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-config\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.157730 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d9f984e5-9578-43d5-be4a-4ea8f5634547-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.157802 4841 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.157837 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.158383 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.158812 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d9f984e5-9578-43d5-be4a-4ea8f5634547-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.158822 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d9f984e5-9578-43d5-be4a-4ea8f5634547-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: 
I1203 18:02:03.175001 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mhq\" (UniqueName: \"kubernetes.io/projected/d9f984e5-9578-43d5-be4a-4ea8f5634547-kube-api-access-59mhq\") pod \"prometheus-metric-storage-0\" (UID: \"d9f984e5-9578-43d5-be4a-4ea8f5634547\") " pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:03 crc kubenswrapper[4841]: I1203 18:02:03.408080 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:04 crc kubenswrapper[4841]: I1203 18:02:04.028529 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Dec 03 18:02:04 crc kubenswrapper[4841]: I1203 18:02:04.257238 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67b6908a-d387-4b04-8151-0e3f4c83901b" path="/var/lib/kubelet/pods/67b6908a-d387-4b04-8151-0e3f4c83901b/volumes" Dec 03 18:02:04 crc kubenswrapper[4841]: I1203 18:02:04.936423 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d9f984e5-9578-43d5-be4a-4ea8f5634547","Type":"ContainerStarted","Data":"63f696e783e896e86484131f34fb5eea087215c8fb745119439530f70a6d423e"} Dec 03 18:02:07 crc kubenswrapper[4841]: I1203 18:02:07.970755 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d9f984e5-9578-43d5-be4a-4ea8f5634547","Type":"ContainerStarted","Data":"2ec02196f7344dce60bccca61615fe2b041cfcc9aa6bce39a8836ac403579711"} Dec 03 18:02:14 crc kubenswrapper[4841]: I1203 18:02:14.239324 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:02:14 crc kubenswrapper[4841]: E1203 18:02:14.240499 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:02:18 crc kubenswrapper[4841]: I1203 18:02:18.099124 4841 generic.go:334] "Generic (PLEG): container finished" podID="d9f984e5-9578-43d5-be4a-4ea8f5634547" containerID="2ec02196f7344dce60bccca61615fe2b041cfcc9aa6bce39a8836ac403579711" exitCode=0 Dec 03 18:02:18 crc kubenswrapper[4841]: I1203 18:02:18.099282 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d9f984e5-9578-43d5-be4a-4ea8f5634547","Type":"ContainerDied","Data":"2ec02196f7344dce60bccca61615fe2b041cfcc9aa6bce39a8836ac403579711"} Dec 03 18:02:19 crc kubenswrapper[4841]: I1203 18:02:19.116645 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d9f984e5-9578-43d5-be4a-4ea8f5634547","Type":"ContainerStarted","Data":"1b9b20b08135c1dd057d2238dd1a4d3ad622c836f124d4d7f060ec89aabc4359"} Dec 03 18:02:23 crc kubenswrapper[4841]: I1203 18:02:23.160828 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d9f984e5-9578-43d5-be4a-4ea8f5634547","Type":"ContainerStarted","Data":"8371b6c61ebafdfd11bbddd76ef575307c3c62f57e76161af3a6fae4c32356a6"} Dec 03 18:02:23 crc kubenswrapper[4841]: I1203 18:02:23.161260 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d9f984e5-9578-43d5-be4a-4ea8f5634547","Type":"ContainerStarted","Data":"79c28a48b029ccac63212c1faa2320fbf55950f83a19cee77a879dc198ab58ed"} Dec 03 18:02:23 crc kubenswrapper[4841]: I1203 18:02:23.188307 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=21.188290856 
podStartE2EDuration="21.188290856s" podCreationTimestamp="2025-12-03 18:02:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:02:23.18399108 +0000 UTC m=+3737.571511807" watchObservedRunningTime="2025-12-03 18:02:23.188290856 +0000 UTC m=+3737.575811583" Dec 03 18:02:23 crc kubenswrapper[4841]: I1203 18:02:23.408851 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:26 crc kubenswrapper[4841]: I1203 18:02:26.381948 4841 scope.go:117] "RemoveContainer" containerID="c874d8d6035533f3b70ba885e2e7152bb42998e2a90e71342dba8aebb97aa00d" Dec 03 18:02:26 crc kubenswrapper[4841]: I1203 18:02:26.422026 4841 scope.go:117] "RemoveContainer" containerID="f846d9bb3ec456d8b2d776551f93d3f06b4b586b6dd35ea14a65e48617f1b835" Dec 03 18:02:26 crc kubenswrapper[4841]: I1203 18:02:26.449364 4841 scope.go:117] "RemoveContainer" containerID="27109f5910226ee377a1094a5c3273443f2c273b4d853b4378f27b347f25c7cc" Dec 03 18:02:28 crc kubenswrapper[4841]: I1203 18:02:28.240732 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:02:28 crc kubenswrapper[4841]: E1203 18:02:28.241584 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:02:33 crc kubenswrapper[4841]: I1203 18:02:33.409233 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:33 crc kubenswrapper[4841]: I1203 18:02:33.431161 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:34 crc kubenswrapper[4841]: I1203 18:02:34.288462 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 18:02:39 crc kubenswrapper[4841]: I1203 18:02:39.238861 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:02:39 crc kubenswrapper[4841]: E1203 18:02:39.242551 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:02:51 crc kubenswrapper[4841]: I1203 18:02:51.239748 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:02:51 crc kubenswrapper[4841]: E1203 18:02:51.240613 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:03:02 crc kubenswrapper[4841]: I1203 18:03:02.240471 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:03:02 crc kubenswrapper[4841]: E1203 18:03:02.241574 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:03:17 crc kubenswrapper[4841]: I1203 18:03:17.239433 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:03:17 crc kubenswrapper[4841]: E1203 18:03:17.240938 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:03:31 crc kubenswrapper[4841]: I1203 18:03:31.239263 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:03:31 crc kubenswrapper[4841]: E1203 18:03:31.240324 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:03:45 crc kubenswrapper[4841]: I1203 18:03:45.239308 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:03:45 crc kubenswrapper[4841]: E1203 18:03:45.239998 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:03:59 crc kubenswrapper[4841]: I1203 18:03:59.238829 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:03:59 crc kubenswrapper[4841]: E1203 18:03:59.239660 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:04:01 crc kubenswrapper[4841]: I1203 18:04:01.082865 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/manager/0.log" Dec 03 18:04:10 crc kubenswrapper[4841]: I1203 18:04:10.239291 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:04:10 crc kubenswrapper[4841]: E1203 18:04:10.240523 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:04:19 crc kubenswrapper[4841]: I1203 18:04:19.809154 
4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zck69/must-gather-vp85t"] Dec 03 18:04:19 crc kubenswrapper[4841]: I1203 18:04:19.811307 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:04:19 crc kubenswrapper[4841]: I1203 18:04:19.813155 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zck69"/"kube-root-ca.crt" Dec 03 18:04:19 crc kubenswrapper[4841]: I1203 18:04:19.813426 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-zck69"/"default-dockercfg-5l66f" Dec 03 18:04:19 crc kubenswrapper[4841]: I1203 18:04:19.814370 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-zck69"/"openshift-service-ca.crt" Dec 03 18:04:19 crc kubenswrapper[4841]: I1203 18:04:19.824078 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zck69/must-gather-vp85t"] Dec 03 18:04:19 crc kubenswrapper[4841]: I1203 18:04:19.968134 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvrng\" (UniqueName: \"kubernetes.io/projected/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-kube-api-access-gvrng\") pod \"must-gather-vp85t\" (UID: \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\") " pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:04:19 crc kubenswrapper[4841]: I1203 18:04:19.968388 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-must-gather-output\") pod \"must-gather-vp85t\" (UID: \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\") " pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:04:20 crc kubenswrapper[4841]: I1203 18:04:20.070557 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-gvrng\" (UniqueName: \"kubernetes.io/projected/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-kube-api-access-gvrng\") pod \"must-gather-vp85t\" (UID: \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\") " pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:04:20 crc kubenswrapper[4841]: I1203 18:04:20.070706 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-must-gather-output\") pod \"must-gather-vp85t\" (UID: \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\") " pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:04:20 crc kubenswrapper[4841]: I1203 18:04:20.071342 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-must-gather-output\") pod \"must-gather-vp85t\" (UID: \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\") " pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:04:20 crc kubenswrapper[4841]: I1203 18:04:20.095643 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvrng\" (UniqueName: \"kubernetes.io/projected/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-kube-api-access-gvrng\") pod \"must-gather-vp85t\" (UID: \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\") " pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:04:20 crc kubenswrapper[4841]: I1203 18:04:20.130992 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:04:20 crc kubenswrapper[4841]: I1203 18:04:20.658850 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-zck69/must-gather-vp85t"] Dec 03 18:04:21 crc kubenswrapper[4841]: I1203 18:04:21.534897 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zck69/must-gather-vp85t" event={"ID":"0ab7f9dd-4fcb-478c-96e4-890562d77a9f","Type":"ContainerStarted","Data":"24c5fdbb33ca4b2c0e3a99f0fc8d24b6dd8a58180dc1fb5e0c6d84e79a26c301"} Dec 03 18:04:25 crc kubenswrapper[4841]: I1203 18:04:25.239784 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:04:25 crc kubenswrapper[4841]: E1203 18:04:25.241211 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:04:25 crc kubenswrapper[4841]: I1203 18:04:25.582858 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zck69/must-gather-vp85t" event={"ID":"0ab7f9dd-4fcb-478c-96e4-890562d77a9f","Type":"ContainerStarted","Data":"5fd1536043d4d8bf2c2f92cae9ef92e6247229de1023805a0ce02aa96c3e960f"} Dec 03 18:04:26 crc kubenswrapper[4841]: I1203 18:04:26.597267 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zck69/must-gather-vp85t" event={"ID":"0ab7f9dd-4fcb-478c-96e4-890562d77a9f","Type":"ContainerStarted","Data":"e9377811da049f6061617d2dea16a4b4143be0aa46772ae5267f690dd1eb3f67"} Dec 03 18:04:26 crc kubenswrapper[4841]: I1203 18:04:26.627503 4841 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-must-gather-zck69/must-gather-vp85t" podStartSLOduration=3.283267704 podStartE2EDuration="7.627486345s" podCreationTimestamp="2025-12-03 18:04:19 +0000 UTC" firstStartedPulling="2025-12-03 18:04:20.837356844 +0000 UTC m=+3855.224877571" lastFinishedPulling="2025-12-03 18:04:25.181575445 +0000 UTC m=+3859.569096212" observedRunningTime="2025-12-03 18:04:26.619830776 +0000 UTC m=+3861.007351533" watchObservedRunningTime="2025-12-03 18:04:26.627486345 +0000 UTC m=+3861.015007072" Dec 03 18:04:29 crc kubenswrapper[4841]: E1203 18:04:29.038256 4841 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.151:42952->38.102.83.151:44085: write tcp 38.102.83.151:42952->38.102.83.151:44085: write: broken pipe Dec 03 18:04:29 crc kubenswrapper[4841]: I1203 18:04:29.758491 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zck69/crc-debug-476h6"] Dec 03 18:04:29 crc kubenswrapper[4841]: I1203 18:04:29.759950 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:04:29 crc kubenswrapper[4841]: I1203 18:04:29.871746 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6mpp\" (UniqueName: \"kubernetes.io/projected/020ea924-b48f-4abe-a818-36001847ecba-kube-api-access-b6mpp\") pod \"crc-debug-476h6\" (UID: \"020ea924-b48f-4abe-a818-36001847ecba\") " pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:04:29 crc kubenswrapper[4841]: I1203 18:04:29.872076 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/020ea924-b48f-4abe-a818-36001847ecba-host\") pod \"crc-debug-476h6\" (UID: \"020ea924-b48f-4abe-a818-36001847ecba\") " pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:04:29 crc kubenswrapper[4841]: I1203 18:04:29.974378 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6mpp\" (UniqueName: \"kubernetes.io/projected/020ea924-b48f-4abe-a818-36001847ecba-kube-api-access-b6mpp\") pod \"crc-debug-476h6\" (UID: \"020ea924-b48f-4abe-a818-36001847ecba\") " pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:04:29 crc kubenswrapper[4841]: I1203 18:04:29.974442 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/020ea924-b48f-4abe-a818-36001847ecba-host\") pod \"crc-debug-476h6\" (UID: \"020ea924-b48f-4abe-a818-36001847ecba\") " pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:04:29 crc kubenswrapper[4841]: I1203 18:04:29.974552 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/020ea924-b48f-4abe-a818-36001847ecba-host\") pod \"crc-debug-476h6\" (UID: \"020ea924-b48f-4abe-a818-36001847ecba\") " pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:04:30 crc 
kubenswrapper[4841]: I1203 18:04:30.001786 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6mpp\" (UniqueName: \"kubernetes.io/projected/020ea924-b48f-4abe-a818-36001847ecba-kube-api-access-b6mpp\") pod \"crc-debug-476h6\" (UID: \"020ea924-b48f-4abe-a818-36001847ecba\") " pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:04:30 crc kubenswrapper[4841]: I1203 18:04:30.084761 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:04:30 crc kubenswrapper[4841]: W1203 18:04:30.125984 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod020ea924_b48f_4abe_a818_36001847ecba.slice/crio-ef2d477fc95a8b7fc369e59163d4ee168754ef45fa0e8ecfaa4ef06bacc56d9e WatchSource:0}: Error finding container ef2d477fc95a8b7fc369e59163d4ee168754ef45fa0e8ecfaa4ef06bacc56d9e: Status 404 returned error can't find the container with id ef2d477fc95a8b7fc369e59163d4ee168754ef45fa0e8ecfaa4ef06bacc56d9e Dec 03 18:04:30 crc kubenswrapper[4841]: I1203 18:04:30.663159 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zck69/crc-debug-476h6" event={"ID":"020ea924-b48f-4abe-a818-36001847ecba","Type":"ContainerStarted","Data":"ef2d477fc95a8b7fc369e59163d4ee168754ef45fa0e8ecfaa4ef06bacc56d9e"} Dec 03 18:04:36 crc kubenswrapper[4841]: I1203 18:04:36.242983 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:04:36 crc kubenswrapper[4841]: E1203 18:04:36.243575 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:04:42 crc kubenswrapper[4841]: I1203 18:04:42.793120 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zck69/crc-debug-476h6" event={"ID":"020ea924-b48f-4abe-a818-36001847ecba","Type":"ContainerStarted","Data":"da93101ddee90b8ecd1d93f11ffa31b6f0db1d7c6a9844378a00dce07e401c16"} Dec 03 18:04:42 crc kubenswrapper[4841]: I1203 18:04:42.817576 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-zck69/crc-debug-476h6" podStartSLOduration=2.340395867 podStartE2EDuration="13.817557166s" podCreationTimestamp="2025-12-03 18:04:29 +0000 UTC" firstStartedPulling="2025-12-03 18:04:30.128996735 +0000 UTC m=+3864.516517482" lastFinishedPulling="2025-12-03 18:04:41.606158014 +0000 UTC m=+3875.993678781" observedRunningTime="2025-12-03 18:04:42.808723148 +0000 UTC m=+3877.196243875" watchObservedRunningTime="2025-12-03 18:04:42.817557166 +0000 UTC m=+3877.205077913" Dec 03 18:04:47 crc kubenswrapper[4841]: I1203 18:04:47.238570 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:04:47 crc kubenswrapper[4841]: E1203 18:04:47.239240 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:04:59 crc kubenswrapper[4841]: I1203 18:04:59.975393 4841 generic.go:334] "Generic (PLEG): container finished" podID="020ea924-b48f-4abe-a818-36001847ecba" containerID="da93101ddee90b8ecd1d93f11ffa31b6f0db1d7c6a9844378a00dce07e401c16" exitCode=0 Dec 
03 18:04:59 crc kubenswrapper[4841]: I1203 18:04:59.975461 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zck69/crc-debug-476h6" event={"ID":"020ea924-b48f-4abe-a818-36001847ecba","Type":"ContainerDied","Data":"da93101ddee90b8ecd1d93f11ffa31b6f0db1d7c6a9844378a00dce07e401c16"} Dec 03 18:05:00 crc kubenswrapper[4841]: I1203 18:05:00.241625 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:05:00 crc kubenswrapper[4841]: E1203 18:05:00.242809 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.133130 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.144237 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6mpp\" (UniqueName: \"kubernetes.io/projected/020ea924-b48f-4abe-a818-36001847ecba-kube-api-access-b6mpp\") pod \"020ea924-b48f-4abe-a818-36001847ecba\" (UID: \"020ea924-b48f-4abe-a818-36001847ecba\") " Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.144353 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/020ea924-b48f-4abe-a818-36001847ecba-host\") pod \"020ea924-b48f-4abe-a818-36001847ecba\" (UID: \"020ea924-b48f-4abe-a818-36001847ecba\") " Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.144483 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/020ea924-b48f-4abe-a818-36001847ecba-host" (OuterVolumeSpecName: "host") pod "020ea924-b48f-4abe-a818-36001847ecba" (UID: "020ea924-b48f-4abe-a818-36001847ecba"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.145022 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/020ea924-b48f-4abe-a818-36001847ecba-host\") on node \"crc\" DevicePath \"\"" Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.151344 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020ea924-b48f-4abe-a818-36001847ecba-kube-api-access-b6mpp" (OuterVolumeSpecName: "kube-api-access-b6mpp") pod "020ea924-b48f-4abe-a818-36001847ecba" (UID: "020ea924-b48f-4abe-a818-36001847ecba"). InnerVolumeSpecName "kube-api-access-b6mpp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.166574 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zck69/crc-debug-476h6"] Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.175430 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zck69/crc-debug-476h6"] Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.246081 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6mpp\" (UniqueName: \"kubernetes.io/projected/020ea924-b48f-4abe-a818-36001847ecba-kube-api-access-b6mpp\") on node \"crc\" DevicePath \"\"" Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.997082 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef2d477fc95a8b7fc369e59163d4ee168754ef45fa0e8ecfaa4ef06bacc56d9e" Dec 03 18:05:01 crc kubenswrapper[4841]: I1203 18:05:01.997114 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zck69/crc-debug-476h6" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.251397 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020ea924-b48f-4abe-a818-36001847ecba" path="/var/lib/kubelet/pods/020ea924-b48f-4abe-a818-36001847ecba/volumes" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.379161 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-zck69/crc-debug-dxq58"] Dec 03 18:05:02 crc kubenswrapper[4841]: E1203 18:05:02.379535 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020ea924-b48f-4abe-a818-36001847ecba" containerName="container-00" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.379553 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="020ea924-b48f-4abe-a818-36001847ecba" containerName="container-00" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.379785 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="020ea924-b48f-4abe-a818-36001847ecba" containerName="container-00" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.380423 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.467959 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvqdp\" (UniqueName: \"kubernetes.io/projected/b3258858-288a-4f70-bd49-0c93c0cac9b9-kube-api-access-zvqdp\") pod \"crc-debug-dxq58\" (UID: \"b3258858-288a-4f70-bd49-0c93c0cac9b9\") " pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.468127 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3258858-288a-4f70-bd49-0c93c0cac9b9-host\") pod \"crc-debug-dxq58\" (UID: \"b3258858-288a-4f70-bd49-0c93c0cac9b9\") " pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.570188 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3258858-288a-4f70-bd49-0c93c0cac9b9-host\") pod \"crc-debug-dxq58\" (UID: \"b3258858-288a-4f70-bd49-0c93c0cac9b9\") " pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.570266 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvqdp\" (UniqueName: \"kubernetes.io/projected/b3258858-288a-4f70-bd49-0c93c0cac9b9-kube-api-access-zvqdp\") pod \"crc-debug-dxq58\" (UID: \"b3258858-288a-4f70-bd49-0c93c0cac9b9\") " pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.570389 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3258858-288a-4f70-bd49-0c93c0cac9b9-host\") pod \"crc-debug-dxq58\" (UID: \"b3258858-288a-4f70-bd49-0c93c0cac9b9\") " pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:02 crc 
kubenswrapper[4841]: I1203 18:05:02.586866 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvqdp\" (UniqueName: \"kubernetes.io/projected/b3258858-288a-4f70-bd49-0c93c0cac9b9-kube-api-access-zvqdp\") pod \"crc-debug-dxq58\" (UID: \"b3258858-288a-4f70-bd49-0c93c0cac9b9\") " pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:02 crc kubenswrapper[4841]: I1203 18:05:02.696815 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:03 crc kubenswrapper[4841]: I1203 18:05:03.007883 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zck69/crc-debug-dxq58" event={"ID":"b3258858-288a-4f70-bd49-0c93c0cac9b9","Type":"ContainerStarted","Data":"eae0df22a61d7fa8afa1daf48aa8430f5cf58a9917417da1327d4fd5436ee25f"} Dec 03 18:05:03 crc kubenswrapper[4841]: I1203 18:05:03.008288 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zck69/crc-debug-dxq58" event={"ID":"b3258858-288a-4f70-bd49-0c93c0cac9b9","Type":"ContainerStarted","Data":"66ff575680ceaccdb05db30a5024d231dbe4c351c0a12fb3b2ec1fe8e89d865e"} Dec 03 18:05:03 crc kubenswrapper[4841]: I1203 18:05:03.044434 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zck69/crc-debug-dxq58"] Dec 03 18:05:03 crc kubenswrapper[4841]: I1203 18:05:03.053282 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zck69/crc-debug-dxq58"] Dec 03 18:05:04 crc kubenswrapper[4841]: I1203 18:05:04.017839 4841 generic.go:334] "Generic (PLEG): container finished" podID="b3258858-288a-4f70-bd49-0c93c0cac9b9" containerID="eae0df22a61d7fa8afa1daf48aa8430f5cf58a9917417da1327d4fd5436ee25f" exitCode=1 Dec 03 18:05:04 crc kubenswrapper[4841]: I1203 18:05:04.124690 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:04 crc kubenswrapper[4841]: I1203 18:05:04.305739 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3258858-288a-4f70-bd49-0c93c0cac9b9-host\") pod \"b3258858-288a-4f70-bd49-0c93c0cac9b9\" (UID: \"b3258858-288a-4f70-bd49-0c93c0cac9b9\") " Dec 03 18:05:04 crc kubenswrapper[4841]: I1203 18:05:04.305855 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvqdp\" (UniqueName: \"kubernetes.io/projected/b3258858-288a-4f70-bd49-0c93c0cac9b9-kube-api-access-zvqdp\") pod \"b3258858-288a-4f70-bd49-0c93c0cac9b9\" (UID: \"b3258858-288a-4f70-bd49-0c93c0cac9b9\") " Dec 03 18:05:04 crc kubenswrapper[4841]: I1203 18:05:04.306693 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3258858-288a-4f70-bd49-0c93c0cac9b9-host" (OuterVolumeSpecName: "host") pod "b3258858-288a-4f70-bd49-0c93c0cac9b9" (UID: "b3258858-288a-4f70-bd49-0c93c0cac9b9"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:05:04 crc kubenswrapper[4841]: I1203 18:05:04.307418 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b3258858-288a-4f70-bd49-0c93c0cac9b9-host\") on node \"crc\" DevicePath \"\"" Dec 03 18:05:04 crc kubenswrapper[4841]: I1203 18:05:04.314204 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3258858-288a-4f70-bd49-0c93c0cac9b9-kube-api-access-zvqdp" (OuterVolumeSpecName: "kube-api-access-zvqdp") pod "b3258858-288a-4f70-bd49-0c93c0cac9b9" (UID: "b3258858-288a-4f70-bd49-0c93c0cac9b9"). InnerVolumeSpecName "kube-api-access-zvqdp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:05:04 crc kubenswrapper[4841]: I1203 18:05:04.408733 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvqdp\" (UniqueName: \"kubernetes.io/projected/b3258858-288a-4f70-bd49-0c93c0cac9b9-kube-api-access-zvqdp\") on node \"crc\" DevicePath \"\"" Dec 03 18:05:05 crc kubenswrapper[4841]: I1203 18:05:05.028761 4841 scope.go:117] "RemoveContainer" containerID="eae0df22a61d7fa8afa1daf48aa8430f5cf58a9917417da1327d4fd5436ee25f" Dec 03 18:05:05 crc kubenswrapper[4841]: I1203 18:05:05.028802 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zck69/crc-debug-dxq58" Dec 03 18:05:06 crc kubenswrapper[4841]: I1203 18:05:06.251933 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3258858-288a-4f70-bd49-0c93c0cac9b9" path="/var/lib/kubelet/pods/b3258858-288a-4f70-bd49-0c93c0cac9b9/volumes" Dec 03 18:05:14 crc kubenswrapper[4841]: I1203 18:05:14.239251 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:05:14 crc kubenswrapper[4841]: E1203 18:05:14.240024 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:05:25 crc kubenswrapper[4841]: I1203 18:05:25.239449 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:05:25 crc kubenswrapper[4841]: E1203 18:05:25.240493 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:05:36 crc kubenswrapper[4841]: I1203 18:05:36.245337 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:05:36 crc kubenswrapper[4841]: E1203 18:05:36.245992 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:05:48 crc kubenswrapper[4841]: I1203 18:05:48.239156 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:05:48 crc kubenswrapper[4841]: E1203 18:05:48.240108 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:05:55 crc kubenswrapper[4841]: I1203 18:05:55.414099 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_00f46b57-a05f-43fe-b97d-1f59137df281/init-config-reloader/0.log" Dec 03 18:05:55 crc kubenswrapper[4841]: I1203 18:05:55.604836 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_alertmanager-metric-storage-0_00f46b57-a05f-43fe-b97d-1f59137df281/init-config-reloader/0.log" Dec 03 18:05:55 crc kubenswrapper[4841]: I1203 18:05:55.637208 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_00f46b57-a05f-43fe-b97d-1f59137df281/config-reloader/0.log" Dec 03 18:05:55 crc kubenswrapper[4841]: I1203 18:05:55.648843 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_00f46b57-a05f-43fe-b97d-1f59137df281/alertmanager/0.log" Dec 03 18:05:55 crc kubenswrapper[4841]: I1203 18:05:55.884156 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6ece35e7-8cb6-4585-be80-cb4a526af861/aodh-api/0.log" Dec 03 18:05:55 crc kubenswrapper[4841]: I1203 18:05:55.911324 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6ece35e7-8cb6-4585-be80-cb4a526af861/aodh-listener/0.log" Dec 03 18:05:55 crc kubenswrapper[4841]: I1203 18:05:55.929033 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6ece35e7-8cb6-4585-be80-cb4a526af861/aodh-evaluator/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.057850 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6ece35e7-8cb6-4585-be80-cb4a526af861/aodh-notifier/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.100163 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57c8cf4866-6qqks_a2adaec8-2204-42ce-bc82-2f7e45008cad/barbican-api/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.279433 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-745f9599f8-67b5b_ec85a1b0-91a1-4d24-a64b-239c100a7861/barbican-keystone-listener/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.284712 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-57c8cf4866-6qqks_a2adaec8-2204-42ce-bc82-2f7e45008cad/barbican-api-log/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.304062 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-745f9599f8-67b5b_ec85a1b0-91a1-4d24-a64b-239c100a7861/barbican-keystone-listener-log/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.524076 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-b7bb8bfcf-5cwg2_e2bde16a-a813-49ff-ac28-11cf8d1dfac4/barbican-worker-log/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.537772 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-b7bb8bfcf-5cwg2_e2bde16a-a813-49ff-ac28-11cf8d1dfac4/barbican-worker/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.797744 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc_15e8ed9f-b5ae-44bb-b295-1222cdad5513/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.811182 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0610d33e-9635-49f9-a9db-f2bbac336470/ceilometer-central-agent/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.844757 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0610d33e-9635-49f9-a9db-f2bbac336470/ceilometer-notification-agent/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.936224 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0610d33e-9635-49f9-a9db-f2bbac336470/proxy-httpd/0.log" Dec 03 18:05:56 crc kubenswrapper[4841]: I1203 18:05:56.997240 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0610d33e-9635-49f9-a9db-f2bbac336470/sg-core/0.log" Dec 03 18:05:57 crc 
kubenswrapper[4841]: I1203 18:05:57.292332 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c12abe1e-e6a0-4bed-9bab-feb7bf43622d/cinder-api/0.log" Dec 03 18:05:57 crc kubenswrapper[4841]: I1203 18:05:57.331523 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c12abe1e-e6a0-4bed-9bab-feb7bf43622d/cinder-api-log/0.log" Dec 03 18:05:57 crc kubenswrapper[4841]: I1203 18:05:57.458602 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7b91ca91-fe2e-4f87-948f-f4db1f3a5854/cinder-scheduler/0.log" Dec 03 18:05:57 crc kubenswrapper[4841]: I1203 18:05:57.524204 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7b91ca91-fe2e-4f87-948f-f4db1f3a5854/probe/0.log" Dec 03 18:05:57 crc kubenswrapper[4841]: I1203 18:05:57.651635 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b_b09b36ac-de85-4fa1-ab95-1c24c0c33c0d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:05:57 crc kubenswrapper[4841]: I1203 18:05:57.739445 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt_b0be8d7d-6270-4255-8c6d-6a50f8c741a2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:05:57 crc kubenswrapper[4841]: I1203 18:05:57.860845 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-r9xqr_42fb8094-c7e2-45f8-932f-e6b868d4cc38/init/0.log" Dec 03 18:05:58 crc kubenswrapper[4841]: I1203 18:05:58.014993 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-r9xqr_42fb8094-c7e2-45f8-932f-e6b868d4cc38/init/0.log" Dec 03 18:05:58 crc kubenswrapper[4841]: I1203 18:05:58.033318 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-r9xqr_42fb8094-c7e2-45f8-932f-e6b868d4cc38/dnsmasq-dns/0.log" Dec 03 18:05:58 crc kubenswrapper[4841]: I1203 18:05:58.111664 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-sx49z_dfb9c7ef-19c9-4582-b13c-c399a2ef4e73/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:05:58 crc kubenswrapper[4841]: I1203 18:05:58.247726 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3c031e76-65c3-4f33-a588-3a76aa8a2c0b/glance-httpd/0.log" Dec 03 18:05:58 crc kubenswrapper[4841]: I1203 18:05:58.297017 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3c031e76-65c3-4f33-a588-3a76aa8a2c0b/glance-log/0.log" Dec 03 18:05:58 crc kubenswrapper[4841]: I1203 18:05:58.458573 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a0e112d6-6648-48f5-872e-f4ac5e81de4e/glance-httpd/0.log" Dec 03 18:05:58 crc kubenswrapper[4841]: I1203 18:05:58.501839 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a0e112d6-6648-48f5-872e-f4ac5e81de4e/glance-log/0.log" Dec 03 18:05:59 crc kubenswrapper[4841]: I1203 18:05:59.256209 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-555b568f78-v86bh_8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d/heat-engine/0.log" Dec 03 18:05:59 crc kubenswrapper[4841]: I1203 18:05:59.347233 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-c5f5d9bb6-ddbgn_bfc25bf9-7fd7-4f92-9dbb-61f291592975/heat-cfnapi/0.log" Dec 03 18:05:59 crc kubenswrapper[4841]: I1203 18:05:59.348742 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-799f468d8f-qbwl4_839b1e72-e619-4c4c-80ac-0754251beb2a/heat-api/0.log" Dec 03 18:05:59 crc kubenswrapper[4841]: 
I1203 18:05:59.470137 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7_1e098ac8-ac99-4b82-8723-7171dbb84329/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:05:59 crc kubenswrapper[4841]: I1203 18:05:59.574602 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-r5qds_ddd71965-3c25-46fe-a129-4e674bf7dcca/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:05:59 crc kubenswrapper[4841]: I1203 18:05:59.703768 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-59cdc4489f-kzmfh_471f221b-da02-49b2-901b-c8afd7aa38c5/keystone-api/0.log" Dec 03 18:05:59 crc kubenswrapper[4841]: I1203 18:05:59.766112 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29413081-tmxhg_a19d874c-b175-4268-87ae-bec2516b1e1a/keystone-cron/0.log" Dec 03 18:05:59 crc kubenswrapper[4841]: I1203 18:05:59.836431 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_18b7d958-8083-489d-ab83-9cc342dbad71/kube-state-metrics/0.log" Dec 03 18:06:00 crc kubenswrapper[4841]: I1203 18:06:00.147297 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s_fbcfe5cb-55b6-4840-ad0c-a916165933d6/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:00 crc kubenswrapper[4841]: I1203 18:06:00.325743 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fbf7d7cfc-n8r2b_7e14311f-cc1f-454b-af0b-94a5cf3ed4e3/neutron-api/0.log" Dec 03 18:06:00 crc kubenswrapper[4841]: I1203 18:06:00.377467 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fbf7d7cfc-n8r2b_7e14311f-cc1f-454b-af0b-94a5cf3ed4e3/neutron-httpd/0.log" Dec 03 18:06:00 crc kubenswrapper[4841]: I1203 18:06:00.694732 4841 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt_390fe67f-4d0d-459c-9f27-d6cc843c2d55/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:00 crc kubenswrapper[4841]: I1203 18:06:00.983517 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4399b120-7a3b-430f-ad42-21a2c9bd0af5/nova-api-log/0.log" Dec 03 18:06:01 crc kubenswrapper[4841]: I1203 18:06:01.145277 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fd4f28bc-9247-408c-a91e-94a0c739bfce/nova-cell0-conductor-conductor/0.log" Dec 03 18:06:01 crc kubenswrapper[4841]: I1203 18:06:01.274686 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4399b120-7a3b-430f-ad42-21a2c9bd0af5/nova-api-api/0.log" Dec 03 18:06:01 crc kubenswrapper[4841]: I1203 18:06:01.322995 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_de4fa66a-9d61-40a5-97f9-4c7841e1ca58/nova-cell1-conductor-conductor/0.log" Dec 03 18:06:01 crc kubenswrapper[4841]: I1203 18:06:01.658315 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_187b2155-68e5-419d-b438-e22374486ae8/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 18:06:01 crc kubenswrapper[4841]: I1203 18:06:01.783482 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-7s8tk_986f7983-1ff5-4510-a8e9-0e45c0fddd19/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:01 crc kubenswrapper[4841]: I1203 18:06:01.995368 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a3d1d17d-16e0-4160-93bc-3a926fedbfbd/nova-metadata-log/0.log" Dec 03 18:06:02 crc kubenswrapper[4841]: I1203 18:06:02.183433 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0/nova-scheduler-scheduler/0.log" Dec 03 18:06:02 crc kubenswrapper[4841]: I1203 18:06:02.238620 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:06:02 crc kubenswrapper[4841]: E1203 18:06:02.239752 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:06:02 crc kubenswrapper[4841]: I1203 18:06:02.297066 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_382c890c-5616-49ab-afd8-59fa071147b4/mysql-bootstrap/0.log" Dec 03 18:06:02 crc kubenswrapper[4841]: I1203 18:06:02.520353 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_382c890c-5616-49ab-afd8-59fa071147b4/mysql-bootstrap/0.log" Dec 03 18:06:02 crc kubenswrapper[4841]: I1203 18:06:02.576597 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_382c890c-5616-49ab-afd8-59fa071147b4/galera/0.log" Dec 03 18:06:02 crc kubenswrapper[4841]: I1203 18:06:02.714804 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c3685ed-a2fd-4f00-9452-70f9713117b3/mysql-bootstrap/0.log" Dec 03 18:06:02 crc kubenswrapper[4841]: I1203 18:06:02.952341 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c3685ed-a2fd-4f00-9452-70f9713117b3/mysql-bootstrap/0.log" Dec 03 18:06:02 crc kubenswrapper[4841]: I1203 18:06:02.965784 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_7c3685ed-a2fd-4f00-9452-70f9713117b3/galera/0.log" Dec 03 18:06:03 crc kubenswrapper[4841]: I1203 18:06:03.202187 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7/openstackclient/0.log" Dec 03 18:06:03 crc kubenswrapper[4841]: I1203 18:06:03.277897 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-cqt22_41fcd3cb-c81b-4a15-bb57-aa38bfa47e41/ovn-controller/0.log" Dec 03 18:06:03 crc kubenswrapper[4841]: I1203 18:06:03.314638 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a3d1d17d-16e0-4160-93bc-3a926fedbfbd/nova-metadata-metadata/0.log" Dec 03 18:06:03 crc kubenswrapper[4841]: I1203 18:06:03.459174 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6vxfx_805e03d0-c25b-4b59-8d0b-d526bc7fcc85/openstack-network-exporter/0.log" Dec 03 18:06:03 crc kubenswrapper[4841]: I1203 18:06:03.546866 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cj8qw_b3fab8b5-6122-451f-9b66-a1dbb0813c1b/ovsdb-server-init/0.log" Dec 03 18:06:03 crc kubenswrapper[4841]: I1203 18:06:03.760110 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cj8qw_b3fab8b5-6122-451f-9b66-a1dbb0813c1b/ovs-vswitchd/0.log" Dec 03 18:06:03 crc kubenswrapper[4841]: I1203 18:06:03.811240 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cj8qw_b3fab8b5-6122-451f-9b66-a1dbb0813c1b/ovsdb-server-init/0.log" Dec 03 18:06:03 crc kubenswrapper[4841]: I1203 18:06:03.832548 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cj8qw_b3fab8b5-6122-451f-9b66-a1dbb0813c1b/ovsdb-server/0.log" Dec 03 18:06:03 crc kubenswrapper[4841]: I1203 18:06:03.987435 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-js7nc_14e63bf1-717a-40b6-8c5d-e46bf40c68dc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:04 crc kubenswrapper[4841]: I1203 18:06:04.036311 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cbd132af-f941-486f-8791-402bae76197f/ovn-northd/0.log" Dec 03 18:06:04 crc kubenswrapper[4841]: I1203 18:06:04.086747 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cbd132af-f941-486f-8791-402bae76197f/openstack-network-exporter/0.log" Dec 03 18:06:04 crc kubenswrapper[4841]: I1203 18:06:04.320452 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a85311e5-2270-4d86-a617-1b7da0a346c8/openstack-network-exporter/0.log" Dec 03 18:06:04 crc kubenswrapper[4841]: I1203 18:06:04.366482 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a85311e5-2270-4d86-a617-1b7da0a346c8/ovsdbserver-nb/0.log" Dec 03 18:06:04 crc kubenswrapper[4841]: I1203 18:06:04.488619 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f9b50c59-2571-4a25-bff5-bc84b18d7315/openstack-network-exporter/0.log" Dec 03 18:06:04 crc kubenswrapper[4841]: I1203 18:06:04.553929 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f9b50c59-2571-4a25-bff5-bc84b18d7315/ovsdbserver-sb/0.log" Dec 03 18:06:04 crc kubenswrapper[4841]: I1203 18:06:04.617053 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59984668d4-h88x4_753fad74-21af-48dd-ae45-1162eb580f22/placement-api/0.log" Dec 03 18:06:04 crc kubenswrapper[4841]: I1203 18:06:04.809751 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59984668d4-h88x4_753fad74-21af-48dd-ae45-1162eb580f22/placement-log/0.log" Dec 03 18:06:04 crc kubenswrapper[4841]: I1203 18:06:04.868189 4841 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/init-config-reloader/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.030465 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/init-config-reloader/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.037705 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/config-reloader/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.069991 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/thanos-sidecar/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.076912 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/prometheus/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.271934 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dbbe92f9-c159-49ce-90ab-dd67ff712b36/setup-container/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.425515 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dbbe92f9-c159-49ce-90ab-dd67ff712b36/rabbitmq/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.454368 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dbbe92f9-c159-49ce-90ab-dd67ff712b36/setup-container/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.551629 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b4eda350-169d-4b70-be32-13d2a1ab1aa3/setup-container/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.719391 4841 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b4eda350-169d-4b70-be32-13d2a1ab1aa3/setup-container/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.785684 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b4eda350-169d-4b70-be32-13d2a1ab1aa3/rabbitmq/0.log" Dec 03 18:06:05 crc kubenswrapper[4841]: I1203 18:06:05.867310 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp_c815359c-145d-48c6-936f-98c8f4cf8fff/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:06 crc kubenswrapper[4841]: I1203 18:06:06.031077 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-q7w8g_b310e506-2bc4-400e-acd0-749838969d1c/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:06 crc kubenswrapper[4841]: I1203 18:06:06.067484 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx_c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:06 crc kubenswrapper[4841]: I1203 18:06:06.220572 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rf2mz_4f7cc5b5-8153-47ba-9f43-5e188c86d8c0/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:06 crc kubenswrapper[4841]: I1203 18:06:06.286155 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x8rlr_c05eea2a-71d0-483b-a0b9-92b28743b13e/ssh-known-hosts-edpm-deployment/0.log" Dec 03 18:06:06 crc kubenswrapper[4841]: I1203 18:06:06.521485 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6df7dcffd7-hvqxv_8efd46ec-9481-40f8-be85-637ddafc2291/proxy-server/0.log" Dec 03 18:06:06 crc kubenswrapper[4841]: I1203 18:06:06.650415 4841 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hzcwz_8be83cad-e31a-463f-9eca-837549c69fba/swift-ring-rebalance/0.log" Dec 03 18:06:06 crc kubenswrapper[4841]: I1203 18:06:06.694202 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6df7dcffd7-hvqxv_8efd46ec-9481-40f8-be85-637ddafc2291/proxy-httpd/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.062193 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/account-auditor/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.104602 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/account-reaper/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.165990 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/account-replicator/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.187961 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/account-server/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.219356 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/container-auditor/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.319915 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/container-replicator/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.412862 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/container-server/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.424062 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/container-updater/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.431916 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-auditor/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.523631 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-expirer/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.579291 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-server/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.629062 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-replicator/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.814135 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-updater/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.880289 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-5b8wg"] Dec 03 18:06:07 crc kubenswrapper[4841]: E1203 18:06:07.880855 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3258858-288a-4f70-bd49-0c93c0cac9b9" containerName="container-00" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.880878 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3258858-288a-4f70-bd49-0c93c0cac9b9" containerName="container-00" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.881183 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3258858-288a-4f70-bd49-0c93c0cac9b9" containerName="container-00" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.883117 4841 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.889658 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5b8wg"] Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.894821 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzwg9\" (UniqueName: \"kubernetes.io/projected/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-kube-api-access-hzwg9\") pod \"redhat-operators-5b8wg\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.894978 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-catalog-content\") pod \"redhat-operators-5b8wg\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.895697 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-utilities\") pod \"redhat-operators-5b8wg\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.896624 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/rsync/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.931930 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/swift-recon-cron/0.log" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 
18:06:07.998561 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzwg9\" (UniqueName: \"kubernetes.io/projected/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-kube-api-access-hzwg9\") pod \"redhat-operators-5b8wg\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.998649 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-catalog-content\") pod \"redhat-operators-5b8wg\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.998687 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-utilities\") pod \"redhat-operators-5b8wg\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.999325 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-utilities\") pod \"redhat-operators-5b8wg\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:07 crc kubenswrapper[4841]: I1203 18:06:07.999945 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-catalog-content\") pod \"redhat-operators-5b8wg\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:08 crc kubenswrapper[4841]: I1203 18:06:08.019735 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-hzwg9\" (UniqueName: \"kubernetes.io/projected/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-kube-api-access-hzwg9\") pod \"redhat-operators-5b8wg\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:08 crc kubenswrapper[4841]: I1203 18:06:08.164522 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-82fk8_902cbae6-47ea-4334-8623-8148bb196870/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:08 crc kubenswrapper[4841]: I1203 18:06:08.192535 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf_8880e946-3512-4dfc-9d56-c3210fd50e21/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:06:08 crc kubenswrapper[4841]: I1203 18:06:08.226595 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:08 crc kubenswrapper[4841]: I1203 18:06:08.811622 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-5b8wg"] Dec 03 18:06:09 crc kubenswrapper[4841]: I1203 18:06:09.831245 4841 generic.go:334] "Generic (PLEG): container finished" podID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerID="cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6" exitCode=0 Dec 03 18:06:09 crc kubenswrapper[4841]: I1203 18:06:09.831726 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b8wg" event={"ID":"a623b8e4-f243-4eb5-b3da-697cfe9bd18f","Type":"ContainerDied","Data":"cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6"} Dec 03 18:06:09 crc kubenswrapper[4841]: I1203 18:06:09.831795 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b8wg" 
event={"ID":"a623b8e4-f243-4eb5-b3da-697cfe9bd18f","Type":"ContainerStarted","Data":"adde3dbba33ef0d60ffb578ed744c626a1a3bd3e9ef211a6f97168a9c405cbb3"} Dec 03 18:06:09 crc kubenswrapper[4841]: I1203 18:06:09.833384 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 18:06:11 crc kubenswrapper[4841]: I1203 18:06:11.854852 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b8wg" event={"ID":"a623b8e4-f243-4eb5-b3da-697cfe9bd18f","Type":"ContainerStarted","Data":"6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade"} Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.061178 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-72zlb"] Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.063712 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.071811 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-catalog-content\") pod \"certified-operators-72zlb\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.072188 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-utilities\") pod \"certified-operators-72zlb\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.072235 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7wtqb\" (UniqueName: \"kubernetes.io/projected/60b528b5-410f-4202-a69c-a8f2efec5562-kube-api-access-7wtqb\") pod \"certified-operators-72zlb\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.111017 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72zlb"] Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.174579 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-utilities\") pod \"certified-operators-72zlb\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.174650 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wtqb\" (UniqueName: \"kubernetes.io/projected/60b528b5-410f-4202-a69c-a8f2efec5562-kube-api-access-7wtqb\") pod \"certified-operators-72zlb\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.174685 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-catalog-content\") pod \"certified-operators-72zlb\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.175659 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-catalog-content\") pod \"certified-operators-72zlb\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " 
pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.177843 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-utilities\") pod \"certified-operators-72zlb\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.213845 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wtqb\" (UniqueName: \"kubernetes.io/projected/60b528b5-410f-4202-a69c-a8f2efec5562-kube-api-access-7wtqb\") pod \"certified-operators-72zlb\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:12 crc kubenswrapper[4841]: I1203 18:06:12.426327 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:13 crc kubenswrapper[4841]: I1203 18:06:13.166669 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-72zlb"] Dec 03 18:06:13 crc kubenswrapper[4841]: I1203 18:06:13.239038 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:06:13 crc kubenswrapper[4841]: E1203 18:06:13.239454 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:06:13 crc kubenswrapper[4841]: I1203 18:06:13.892108 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-72zlb" event={"ID":"60b528b5-410f-4202-a69c-a8f2efec5562","Type":"ContainerStarted","Data":"30b891fb109baa1826b045efacf7b2433f9d7dee8b1593be4c26250616f12a87"} Dec 03 18:06:14 crc kubenswrapper[4841]: I1203 18:06:14.905639 4841 generic.go:334] "Generic (PLEG): container finished" podID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerID="6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade" exitCode=0 Dec 03 18:06:14 crc kubenswrapper[4841]: I1203 18:06:14.905683 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b8wg" event={"ID":"a623b8e4-f243-4eb5-b3da-697cfe9bd18f","Type":"ContainerDied","Data":"6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade"} Dec 03 18:06:14 crc kubenswrapper[4841]: I1203 18:06:14.911543 4841 generic.go:334] "Generic (PLEG): container finished" podID="60b528b5-410f-4202-a69c-a8f2efec5562" containerID="33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612" exitCode=0 Dec 03 18:06:14 crc kubenswrapper[4841]: I1203 18:06:14.911579 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zlb" event={"ID":"60b528b5-410f-4202-a69c-a8f2efec5562","Type":"ContainerDied","Data":"33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612"} Dec 03 18:06:16 crc kubenswrapper[4841]: I1203 18:06:16.935192 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b8wg" event={"ID":"a623b8e4-f243-4eb5-b3da-697cfe9bd18f","Type":"ContainerStarted","Data":"f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa"} Dec 03 18:06:16 crc kubenswrapper[4841]: I1203 18:06:16.939078 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zlb" 
event={"ID":"60b528b5-410f-4202-a69c-a8f2efec5562","Type":"ContainerStarted","Data":"ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3"} Dec 03 18:06:16 crc kubenswrapper[4841]: I1203 18:06:16.963418 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-5b8wg" podStartSLOduration=3.517671637 podStartE2EDuration="9.963394207s" podCreationTimestamp="2025-12-03 18:06:07 +0000 UTC" firstStartedPulling="2025-12-03 18:06:09.833174215 +0000 UTC m=+3964.220694942" lastFinishedPulling="2025-12-03 18:06:16.278896785 +0000 UTC m=+3970.666417512" observedRunningTime="2025-12-03 18:06:16.955357999 +0000 UTC m=+3971.342878726" watchObservedRunningTime="2025-12-03 18:06:16.963394207 +0000 UTC m=+3971.350914934" Dec 03 18:06:17 crc kubenswrapper[4841]: I1203 18:06:17.370735 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_81cb6c96-a5d5-4120-8cb3-101344626b07/memcached/0.log" Dec 03 18:06:18 crc kubenswrapper[4841]: I1203 18:06:18.226717 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:18 crc kubenswrapper[4841]: I1203 18:06:18.227036 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:19 crc kubenswrapper[4841]: I1203 18:06:19.280338 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-5b8wg" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerName="registry-server" probeResult="failure" output=< Dec 03 18:06:19 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Dec 03 18:06:19 crc kubenswrapper[4841]: > Dec 03 18:06:19 crc kubenswrapper[4841]: I1203 18:06:19.965538 4841 generic.go:334] "Generic (PLEG): container finished" podID="60b528b5-410f-4202-a69c-a8f2efec5562" 
containerID="ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3" exitCode=0 Dec 03 18:06:19 crc kubenswrapper[4841]: I1203 18:06:19.965589 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zlb" event={"ID":"60b528b5-410f-4202-a69c-a8f2efec5562","Type":"ContainerDied","Data":"ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3"} Dec 03 18:06:20 crc kubenswrapper[4841]: I1203 18:06:20.977246 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zlb" event={"ID":"60b528b5-410f-4202-a69c-a8f2efec5562","Type":"ContainerStarted","Data":"5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b"} Dec 03 18:06:21 crc kubenswrapper[4841]: I1203 18:06:21.008427 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-72zlb" podStartSLOduration=3.565545471 podStartE2EDuration="9.008411032s" podCreationTimestamp="2025-12-03 18:06:12 +0000 UTC" firstStartedPulling="2025-12-03 18:06:14.913318905 +0000 UTC m=+3969.300839632" lastFinishedPulling="2025-12-03 18:06:20.356184456 +0000 UTC m=+3974.743705193" observedRunningTime="2025-12-03 18:06:21.003422859 +0000 UTC m=+3975.390943586" watchObservedRunningTime="2025-12-03 18:06:21.008411032 +0000 UTC m=+3975.395931759" Dec 03 18:06:22 crc kubenswrapper[4841]: I1203 18:06:22.431276 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:22 crc kubenswrapper[4841]: I1203 18:06:22.431568 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:22 crc kubenswrapper[4841]: I1203 18:06:22.480720 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:27 crc kubenswrapper[4841]: I1203 
18:06:27.239509 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:06:27 crc kubenswrapper[4841]: E1203 18:06:27.240306 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:06:28 crc kubenswrapper[4841]: I1203 18:06:28.302746 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:28 crc kubenswrapper[4841]: I1203 18:06:28.367969 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:28 crc kubenswrapper[4841]: I1203 18:06:28.545334 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5b8wg"] Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.064680 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-5b8wg" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerName="registry-server" containerID="cri-o://f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa" gracePeriod=2 Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.624505 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.771670 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-utilities\") pod \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.771963 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-catalog-content\") pod \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.772113 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzwg9\" (UniqueName: \"kubernetes.io/projected/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-kube-api-access-hzwg9\") pod \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\" (UID: \"a623b8e4-f243-4eb5-b3da-697cfe9bd18f\") " Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.772453 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-utilities" (OuterVolumeSpecName: "utilities") pod "a623b8e4-f243-4eb5-b3da-697cfe9bd18f" (UID: "a623b8e4-f243-4eb5-b3da-697cfe9bd18f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.772883 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.780629 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-kube-api-access-hzwg9" (OuterVolumeSpecName: "kube-api-access-hzwg9") pod "a623b8e4-f243-4eb5-b3da-697cfe9bd18f" (UID: "a623b8e4-f243-4eb5-b3da-697cfe9bd18f"). InnerVolumeSpecName "kube-api-access-hzwg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.876632 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hzwg9\" (UniqueName: \"kubernetes.io/projected/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-kube-api-access-hzwg9\") on node \"crc\" DevicePath \"\"" Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.888601 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a623b8e4-f243-4eb5-b3da-697cfe9bd18f" (UID: "a623b8e4-f243-4eb5-b3da-697cfe9bd18f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:06:30 crc kubenswrapper[4841]: I1203 18:06:30.978317 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a623b8e4-f243-4eb5-b3da-697cfe9bd18f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.077960 4841 generic.go:334] "Generic (PLEG): container finished" podID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerID="f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa" exitCode=0 Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.078012 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b8wg" event={"ID":"a623b8e4-f243-4eb5-b3da-697cfe9bd18f","Type":"ContainerDied","Data":"f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa"} Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.078046 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-5b8wg" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.078069 4841 scope.go:117] "RemoveContainer" containerID="f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.078053 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-5b8wg" event={"ID":"a623b8e4-f243-4eb5-b3da-697cfe9bd18f","Type":"ContainerDied","Data":"adde3dbba33ef0d60ffb578ed744c626a1a3bd3e9ef211a6f97168a9c405cbb3"} Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.107102 4841 scope.go:117] "RemoveContainer" containerID="6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.148302 4841 scope.go:117] "RemoveContainer" containerID="cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.160741 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-5b8wg"] Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.181542 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-5b8wg"] Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.194542 4841 scope.go:117] "RemoveContainer" containerID="f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa" Dec 03 18:06:31 crc kubenswrapper[4841]: E1203 18:06:31.195064 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa\": container with ID starting with f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa not found: ID does not exist" containerID="f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.195163 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa"} err="failed to get container status \"f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa\": rpc error: code = NotFound desc = could not find container \"f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa\": container with ID starting with f6b998226cdbdfda7fbec115841e4223bb388084e2cba1b6de7d9acb746141fa not found: ID does not exist" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.195198 4841 scope.go:117] "RemoveContainer" containerID="6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade" Dec 03 18:06:31 crc kubenswrapper[4841]: E1203 18:06:31.195506 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade\": container with ID starting with 6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade not found: ID does not exist" containerID="6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.195557 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade"} err="failed to get container status \"6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade\": rpc error: code = NotFound desc = could not find container \"6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade\": container with ID starting with 6d236213f2d0dd4fc3e8a6d50bc91080fc529f4850a21db8deadb42ec71d6ade not found: ID does not exist" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.195587 4841 scope.go:117] "RemoveContainer" containerID="cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6" Dec 03 18:06:31 crc kubenswrapper[4841]: E1203 
18:06:31.196046 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6\": container with ID starting with cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6 not found: ID does not exist" containerID="cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6" Dec 03 18:06:31 crc kubenswrapper[4841]: I1203 18:06:31.196070 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6"} err="failed to get container status \"cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6\": rpc error: code = NotFound desc = could not find container \"cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6\": container with ID starting with cc389f1d30b4d4aa064f6017b3fe0ad76c0c38cbbc52bef7126be05a0f118db6 not found: ID does not exist" Dec 03 18:06:32 crc kubenswrapper[4841]: I1203 18:06:32.252344 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" path="/var/lib/kubelet/pods/a623b8e4-f243-4eb5-b3da-697cfe9bd18f/volumes" Dec 03 18:06:32 crc kubenswrapper[4841]: I1203 18:06:32.497571 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:32 crc kubenswrapper[4841]: I1203 18:06:32.945086 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72zlb"] Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.099319 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-72zlb" podUID="60b528b5-410f-4202-a69c-a8f2efec5562" containerName="registry-server" containerID="cri-o://5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b" gracePeriod=2 Dec 
03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.556127 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.728709 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-catalog-content\") pod \"60b528b5-410f-4202-a69c-a8f2efec5562\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.729032 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wtqb\" (UniqueName: \"kubernetes.io/projected/60b528b5-410f-4202-a69c-a8f2efec5562-kube-api-access-7wtqb\") pod \"60b528b5-410f-4202-a69c-a8f2efec5562\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.729248 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-utilities\") pod \"60b528b5-410f-4202-a69c-a8f2efec5562\" (UID: \"60b528b5-410f-4202-a69c-a8f2efec5562\") " Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.729810 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-utilities" (OuterVolumeSpecName: "utilities") pod "60b528b5-410f-4202-a69c-a8f2efec5562" (UID: "60b528b5-410f-4202-a69c-a8f2efec5562"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.740185 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60b528b5-410f-4202-a69c-a8f2efec5562-kube-api-access-7wtqb" (OuterVolumeSpecName: "kube-api-access-7wtqb") pod "60b528b5-410f-4202-a69c-a8f2efec5562" (UID: "60b528b5-410f-4202-a69c-a8f2efec5562"). InnerVolumeSpecName "kube-api-access-7wtqb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.782969 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60b528b5-410f-4202-a69c-a8f2efec5562" (UID: "60b528b5-410f-4202-a69c-a8f2efec5562"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.831362 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.831390 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wtqb\" (UniqueName: \"kubernetes.io/projected/60b528b5-410f-4202-a69c-a8f2efec5562-kube-api-access-7wtqb\") on node \"crc\" DevicePath \"\"" Dec 03 18:06:33 crc kubenswrapper[4841]: I1203 18:06:33.831400 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60b528b5-410f-4202-a69c-a8f2efec5562-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.122292 4841 generic.go:334] "Generic (PLEG): container finished" podID="60b528b5-410f-4202-a69c-a8f2efec5562" 
containerID="5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b" exitCode=0 Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.122378 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zlb" event={"ID":"60b528b5-410f-4202-a69c-a8f2efec5562","Type":"ContainerDied","Data":"5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b"} Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.122433 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-72zlb" event={"ID":"60b528b5-410f-4202-a69c-a8f2efec5562","Type":"ContainerDied","Data":"30b891fb109baa1826b045efacf7b2433f9d7dee8b1593be4c26250616f12a87"} Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.122471 4841 scope.go:117] "RemoveContainer" containerID="5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.122736 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-72zlb" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.161674 4841 scope.go:117] "RemoveContainer" containerID="ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.177578 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-72zlb"] Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.193040 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-72zlb"] Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.214864 4841 scope.go:117] "RemoveContainer" containerID="33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.253581 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60b528b5-410f-4202-a69c-a8f2efec5562" path="/var/lib/kubelet/pods/60b528b5-410f-4202-a69c-a8f2efec5562/volumes" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.256593 4841 scope.go:117] "RemoveContainer" containerID="5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b" Dec 03 18:06:34 crc kubenswrapper[4841]: E1203 18:06:34.257431 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b\": container with ID starting with 5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b not found: ID does not exist" containerID="5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.257481 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b"} err="failed to get container status 
\"5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b\": rpc error: code = NotFound desc = could not find container \"5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b\": container with ID starting with 5e8ed6a8c5659dede443c99ef78f942a215620db44b8719666b59743447ff68b not found: ID does not exist" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.257509 4841 scope.go:117] "RemoveContainer" containerID="ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3" Dec 03 18:06:34 crc kubenswrapper[4841]: E1203 18:06:34.257954 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3\": container with ID starting with ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3 not found: ID does not exist" containerID="ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.257994 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3"} err="failed to get container status \"ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3\": rpc error: code = NotFound desc = could not find container \"ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3\": container with ID starting with ea1eafabe52364ef821017b4f35092e6da8849c7fd517ecc00ebf0d9e65d4fd3 not found: ID does not exist" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.258019 4841 scope.go:117] "RemoveContainer" containerID="33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612" Dec 03 18:06:34 crc kubenswrapper[4841]: E1203 18:06:34.258427 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612\": container with ID starting with 33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612 not found: ID does not exist" containerID="33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612" Dec 03 18:06:34 crc kubenswrapper[4841]: I1203 18:06:34.258460 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612"} err="failed to get container status \"33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612\": rpc error: code = NotFound desc = could not find container \"33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612\": container with ID starting with 33531fc5bb41f3b0ae1f55e95c09ea23d565b3ed81388187b591d1a4006db612 not found: ID does not exist" Dec 03 18:06:36 crc kubenswrapper[4841]: I1203 18:06:36.875610 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/util/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.032635 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/util/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.073790 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/pull/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.133563 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/pull/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.295424 4841 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/pull/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.339622 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/util/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.351153 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/extract/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.494734 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-hwp8p_a6ef72b8-96de-4545-9100-081f42138dff/kube-rbac-proxy/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.600785 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qgzs7_6dbdda39-de04-49e2-8667-58eb77b076b9/kube-rbac-proxy/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.608540 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-hwp8p_a6ef72b8-96de-4545-9100-081f42138dff/manager/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.749063 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qgzs7_6dbdda39-de04-49e2-8667-58eb77b076b9/manager/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.819928 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-bfvv8_d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8/manager/0.log" Dec 03 18:06:37 crc 
kubenswrapper[4841]: I1203 18:06:37.824775 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-bfvv8_d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8/kube-rbac-proxy/0.log" Dec 03 18:06:37 crc kubenswrapper[4841]: I1203 18:06:37.963571 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-jmzr6_c969bc4d-df07-4ec7-b406-7de0710faca8/kube-rbac-proxy/0.log" Dec 03 18:06:38 crc kubenswrapper[4841]: I1203 18:06:38.048819 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-jmzr6_c969bc4d-df07-4ec7-b406-7de0710faca8/manager/0.log" Dec 03 18:06:38 crc kubenswrapper[4841]: I1203 18:06:38.382060 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-z8zb5_cae5c7a3-2395-4cfe-93f2-5a7301c52444/kube-rbac-proxy/0.log" Dec 03 18:06:38 crc kubenswrapper[4841]: I1203 18:06:38.447206 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-z8zb5_cae5c7a3-2395-4cfe-93f2-5a7301c52444/manager/0.log" Dec 03 18:06:38 crc kubenswrapper[4841]: I1203 18:06:38.478645 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8jcwp_096189c4-aa40-4a3d-b8df-f8dbfa674e08/kube-rbac-proxy/0.log" Dec 03 18:06:38 crc kubenswrapper[4841]: I1203 18:06:38.569358 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8jcwp_096189c4-aa40-4a3d-b8df-f8dbfa674e08/manager/0.log" Dec 03 18:06:38 crc kubenswrapper[4841]: I1203 18:06:38.643272 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-s2b4r_b9bdf600-ace4-4f28-80c9-3dd36cf449ad/kube-rbac-proxy/0.log" Dec 03 18:06:38 crc kubenswrapper[4841]: I1203 18:06:38.817974 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-s2b4r_b9bdf600-ace4-4f28-80c9-3dd36cf449ad/manager/0.log" Dec 03 18:06:38 crc kubenswrapper[4841]: I1203 18:06:38.862028 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-jfddn_6384ded0-4512-4d89-bef4-004339bb019d/kube-rbac-proxy/0.log" Dec 03 18:06:38 crc kubenswrapper[4841]: I1203 18:06:38.903558 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-jfddn_6384ded0-4512-4d89-bef4-004339bb019d/manager/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.015096 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fj8w7_0dda1581-f45b-42cd-840f-9b8f2d7a48b1/kube-rbac-proxy/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.134814 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fj8w7_0dda1581-f45b-42cd-840f-9b8f2d7a48b1/manager/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.194391 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-khncz_38a95f1c-87ae-4464-b6fa-ad329d17290e/kube-rbac-proxy/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.231936 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-khncz_38a95f1c-87ae-4464-b6fa-ad329d17290e/manager/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.340656 
4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-t9pmr_abd88bfa-5c17-4486-a051-50c1ceaafe60/kube-rbac-proxy/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.391765 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-t9pmr_abd88bfa-5c17-4486-a051-50c1ceaafe60/manager/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.593467 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-627sl_36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb/manager/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.595724 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-627sl_36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb/kube-rbac-proxy/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.711521 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-sl5jm_a253f1cc-d669-490e-9bf4-aff2e95347b0/kube-rbac-proxy/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.842892 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l56t4_b24334e0-1dd6-4667-8ce1-6013cc71dd7f/kube-rbac-proxy/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.853978 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-sl5jm_a253f1cc-d669-490e-9bf4-aff2e95347b0/manager/0.log" Dec 03 18:06:39 crc kubenswrapper[4841]: I1203 18:06:39.909646 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l56t4_b24334e0-1dd6-4667-8ce1-6013cc71dd7f/manager/0.log" Dec 03 18:06:40 crc 
kubenswrapper[4841]: I1203 18:06:40.022153 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c_7e01626f-e7f3-4c48-bc9b-5d9261b3d89a/manager/0.log" Dec 03 18:06:40 crc kubenswrapper[4841]: I1203 18:06:40.052837 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c_7e01626f-e7f3-4c48-bc9b-5d9261b3d89a/kube-rbac-proxy/0.log" Dec 03 18:06:40 crc kubenswrapper[4841]: I1203 18:06:40.429170 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6fdc9d4685-8lhfc_91effd10-805a-48e2-a65a-529fe1e33a37/operator/0.log" Dec 03 18:06:40 crc kubenswrapper[4841]: I1203 18:06:40.499648 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ldgf7_dca21fb1-4a3f-4003-a1e2-46c1b191b911/registry-server/0.log" Dec 03 18:06:40 crc kubenswrapper[4841]: I1203 18:06:40.669090 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2d7g6_7b153ea5-5794-46c6-a3f3-099b3b45dfef/kube-rbac-proxy/0.log" Dec 03 18:06:40 crc kubenswrapper[4841]: I1203 18:06:40.784686 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2d7g6_7b153ea5-5794-46c6-a3f3-099b3b45dfef/manager/0.log" Dec 03 18:06:40 crc kubenswrapper[4841]: I1203 18:06:40.909831 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-q9b79_a4eccc19-eb01-4b44-99e0-041144e4b409/kube-rbac-proxy/0.log" Dec 03 18:06:40 crc kubenswrapper[4841]: I1203 18:06:40.909975 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-q9b79_a4eccc19-eb01-4b44-99e0-041144e4b409/manager/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.068348 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-tv5cl_8a13d35a-d714-4a7f-922b-a6d3a0b580c3/operator/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.157626 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-km89l_70e46d25-a5c6-49b4-b3d5-0828bc234644/kube-rbac-proxy/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.217468 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-km89l_70e46d25-a5c6-49b4-b3d5-0828bc234644/manager/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.411709 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/kube-rbac-proxy/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.554056 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fnjsz_56b976ca-c419-42f4-b063-c0219f4e0a72/kube-rbac-proxy/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.622587 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c7f74d46b-4txld_3448d609-0836-4562-ac6b-03d353471880/manager/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.627916 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/manager/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.662863 4841 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fnjsz_56b976ca-c419-42f4-b063-c0219f4e0a72/manager/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.761494 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r9rbr_fe8b56bd-b492-48cf-a3f2-621b4f58d29c/kube-rbac-proxy/0.log" Dec 03 18:06:41 crc kubenswrapper[4841]: I1203 18:06:41.786953 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r9rbr_fe8b56bd-b492-48cf-a3f2-621b4f58d29c/manager/0.log" Dec 03 18:06:42 crc kubenswrapper[4841]: I1203 18:06:42.238823 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:06:43 crc kubenswrapper[4841]: I1203 18:06:43.216196 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"745ef60cc95958cd5f6f7a2910bd798110a21beb88577482401df6db3aa0a98b"} Dec 03 18:07:01 crc kubenswrapper[4841]: I1203 18:07:01.399838 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ng62w_4c07ec09-68a5-4c56-a97a-5eb0a73a020d/control-plane-machine-set-operator/0.log" Dec 03 18:07:01 crc kubenswrapper[4841]: I1203 18:07:01.544947 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-q4qff_9bf1c45d-ffb9-423e-bdea-7e2d209a47d1/kube-rbac-proxy/0.log" Dec 03 18:07:01 crc kubenswrapper[4841]: I1203 18:07:01.613209 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-q4qff_9bf1c45d-ffb9-423e-bdea-7e2d209a47d1/machine-api-operator/0.log" Dec 03 
18:07:14 crc kubenswrapper[4841]: I1203 18:07:14.475859 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qhbrc_b6111350-39b6-4228-a2ac-3cc25ad33c50/cert-manager-controller/0.log" Dec 03 18:07:14 crc kubenswrapper[4841]: I1203 18:07:14.696433 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-pmzsv_ae7ac6c5-8af0-40d3-9b0b-9009819f439d/cert-manager-cainjector/0.log" Dec 03 18:07:14 crc kubenswrapper[4841]: I1203 18:07:14.738830 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2c5w9_6ea86ddf-89eb-471c-b9f5-1fef42cd94cd/cert-manager-webhook/0.log" Dec 03 18:07:27 crc kubenswrapper[4841]: I1203 18:07:27.733247 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-d4ssd_5bdd62f2-102a-4f3a-80aa-e3600df311a9/nmstate-console-plugin/0.log" Dec 03 18:07:27 crc kubenswrapper[4841]: I1203 18:07:27.890651 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-fq2q6_61ca4cad-30b3-4672-ae6c-59fd14e78a4a/kube-rbac-proxy/0.log" Dec 03 18:07:27 crc kubenswrapper[4841]: I1203 18:07:27.926600 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-mk7jv_7d600af9-9363-42fd-9b6c-dcf7181dc09b/nmstate-handler/0.log" Dec 03 18:07:27 crc kubenswrapper[4841]: I1203 18:07:27.965860 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-fq2q6_61ca4cad-30b3-4672-ae6c-59fd14e78a4a/nmstate-metrics/0.log" Dec 03 18:07:28 crc kubenswrapper[4841]: I1203 18:07:28.076730 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-qw72m_cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0/nmstate-operator/0.log" Dec 03 18:07:28 crc kubenswrapper[4841]: I1203 18:07:28.136695 4841 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-85htl_0b352e6a-f766-4261-87a1-5e71b591df3b/nmstate-webhook/0.log" Dec 03 18:07:42 crc kubenswrapper[4841]: I1203 18:07:42.328925 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-w4nm5_c5228882-2889-44e6-8a36-db179d19fe25/kube-rbac-proxy/0.log" Dec 03 18:07:42 crc kubenswrapper[4841]: I1203 18:07:42.469756 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-w4nm5_c5228882-2889-44e6-8a36-db179d19fe25/controller/0.log" Dec 03 18:07:42 crc kubenswrapper[4841]: I1203 18:07:42.491256 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-frr-files/0.log" Dec 03 18:07:42 crc kubenswrapper[4841]: I1203 18:07:42.692799 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-reloader/0.log" Dec 03 18:07:42 crc kubenswrapper[4841]: I1203 18:07:42.713139 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-frr-files/0.log" Dec 03 18:07:42 crc kubenswrapper[4841]: I1203 18:07:42.718945 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-metrics/0.log" Dec 03 18:07:42 crc kubenswrapper[4841]: I1203 18:07:42.781501 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-reloader/0.log" Dec 03 18:07:43 crc kubenswrapper[4841]: I1203 18:07:43.850630 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-reloader/0.log" Dec 03 18:07:43 crc kubenswrapper[4841]: I1203 18:07:43.857749 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-frr-files/0.log" Dec 03 18:07:43 crc kubenswrapper[4841]: I1203 18:07:43.892622 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-metrics/0.log" Dec 03 18:07:43 crc kubenswrapper[4841]: I1203 18:07:43.917834 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-metrics/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.092799 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-metrics/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.106768 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-reloader/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.107530 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-frr-files/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.116649 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/controller/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.290283 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/kube-rbac-proxy/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.315338 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/frr-metrics/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.345749 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/kube-rbac-proxy-frr/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.537944 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-cncl8_cb37434f-6f72-4e5c-85f5-5e06f1e07692/frr-k8s-webhook-server/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.541767 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/reloader/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.766351 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-67f9cc98fc-kcfzm_1fbe0c23-9239-43a3-981a-87b5d6f3af82/manager/0.log" Dec 03 18:07:44 crc kubenswrapper[4841]: I1203 18:07:44.941625 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7cc87bb9cb-96fv2_57d5b20b-d392-41ab-8729-d877277201e0/webhook-server/0.log" Dec 03 18:07:45 crc kubenswrapper[4841]: I1203 18:07:45.037003 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vj7hl_aaf80919-384d-4751-9ca9-2b9f4994ef1b/kube-rbac-proxy/0.log" Dec 03 18:07:45 crc kubenswrapper[4841]: I1203 18:07:45.640127 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vj7hl_aaf80919-384d-4751-9ca9-2b9f4994ef1b/speaker/0.log" Dec 03 18:07:45 crc kubenswrapper[4841]: I1203 18:07:45.653343 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/frr/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.159320 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/util/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: 
I1203 18:08:00.354327 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/util/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.385333 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/pull/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.386603 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/pull/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.550143 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/util/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.585475 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/extract/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.594124 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/pull/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.768945 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/util/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.907152 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/pull/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.913395 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/pull/0.log" Dec 03 18:08:00 crc kubenswrapper[4841]: I1203 18:08:00.927028 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/util/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.088717 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/pull/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.093813 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/util/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.105241 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/extract/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.257586 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/util/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.435857 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/util/0.log" Dec 03 
18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.436979 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/pull/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.471053 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/pull/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.656133 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/util/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.682989 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/pull/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.683876 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/extract/0.log" Dec 03 18:08:01 crc kubenswrapper[4841]: I1203 18:08:01.864356 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-utilities/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.012279 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-utilities/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.049049 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-content/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.067450 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-content/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.226538 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-content/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.236170 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-utilities/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.697136 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-utilities/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.793581 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/registry-server/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.924420 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-utilities/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.943839 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-content/0.log" Dec 03 18:08:02 crc kubenswrapper[4841]: I1203 18:08:02.949237 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-content/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.119006 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-utilities/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.194263 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-content/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.389878 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qhx4z_b200dd17-70ee-42af-a890-b7f748be7b01/marketplace-operator/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.428264 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-utilities/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.648063 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-utilities/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.676434 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/registry-server/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.704607 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-content/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.727958 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-content/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.837388 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-utilities/0.log" Dec 03 18:08:03 crc kubenswrapper[4841]: I1203 18:08:03.851274 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-content/0.log" Dec 03 18:08:04 crc kubenswrapper[4841]: I1203 18:08:04.005644 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/registry-server/0.log" Dec 03 18:08:04 crc kubenswrapper[4841]: I1203 18:08:04.472768 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-utilities/0.log" Dec 03 18:08:04 crc kubenswrapper[4841]: I1203 18:08:04.595661 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-utilities/0.log" Dec 03 18:08:04 crc kubenswrapper[4841]: I1203 18:08:04.625450 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-content/0.log" Dec 03 18:08:04 crc kubenswrapper[4841]: I1203 18:08:04.650651 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-content/0.log" Dec 03 18:08:04 crc kubenswrapper[4841]: I1203 18:08:04.861349 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-utilities/0.log" 
Dec 03 18:08:04 crc kubenswrapper[4841]: I1203 18:08:04.881409 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-content/0.log" Dec 03 18:08:05 crc kubenswrapper[4841]: I1203 18:08:05.461245 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/registry-server/0.log" Dec 03 18:08:20 crc kubenswrapper[4841]: I1203 18:08:20.467752 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-f7jbq_ab0ef110-9ded-4408-9f52-0f8bbffd4f25/prometheus-operator/0.log" Dec 03 18:08:20 crc kubenswrapper[4841]: I1203 18:08:20.617379 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4_3f6178c0-01f4-437f-b7bd-bcae5afcec18/prometheus-operator-admission-webhook/0.log" Dec 03 18:08:20 crc kubenswrapper[4841]: I1203 18:08:20.648613 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v_853a7cd4-09bc-40c0-8b4c-3c91fb152dbe/prometheus-operator-admission-webhook/0.log" Dec 03 18:08:20 crc kubenswrapper[4841]: I1203 18:08:20.802047 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-rkbb8_123e62f6-3c8c-45f1-993c-12b1be324d9d/operator/0.log" Dec 03 18:08:20 crc kubenswrapper[4841]: I1203 18:08:20.858946 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-v2t4d_f45374d5-3bf6-468b-9d32-be79178468a8/perses-operator/0.log" Dec 03 18:09:09 crc kubenswrapper[4841]: I1203 18:09:09.315980 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:09:09 crc kubenswrapper[4841]: I1203 18:09:09.316481 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:09:39 crc kubenswrapper[4841]: I1203 18:09:39.317314 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:09:39 crc kubenswrapper[4841]: I1203 18:09:39.317978 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:09:50 crc kubenswrapper[4841]: I1203 18:09:50.354967 4841 generic.go:334] "Generic (PLEG): container finished" podID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" containerID="5fd1536043d4d8bf2c2f92cae9ef92e6247229de1023805a0ce02aa96c3e960f" exitCode=0 Dec 03 18:09:50 crc kubenswrapper[4841]: I1203 18:09:50.355409 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-zck69/must-gather-vp85t" event={"ID":"0ab7f9dd-4fcb-478c-96e4-890562d77a9f","Type":"ContainerDied","Data":"5fd1536043d4d8bf2c2f92cae9ef92e6247229de1023805a0ce02aa96c3e960f"} Dec 03 18:09:50 crc kubenswrapper[4841]: I1203 18:09:50.356037 4841 scope.go:117] "RemoveContainer" 
containerID="5fd1536043d4d8bf2c2f92cae9ef92e6247229de1023805a0ce02aa96c3e960f" Dec 03 18:09:50 crc kubenswrapper[4841]: I1203 18:09:50.426584 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zck69_must-gather-vp85t_0ab7f9dd-4fcb-478c-96e4-890562d77a9f/gather/0.log" Dec 03 18:09:58 crc kubenswrapper[4841]: I1203 18:09:58.286080 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-zck69/must-gather-vp85t"] Dec 03 18:09:58 crc kubenswrapper[4841]: I1203 18:09:58.286641 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-zck69/must-gather-vp85t"] Dec 03 18:09:58 crc kubenswrapper[4841]: I1203 18:09:58.287059 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-zck69/must-gather-vp85t" podUID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" containerName="copy" containerID="cri-o://e9377811da049f6061617d2dea16a4b4143be0aa46772ae5267f690dd1eb3f67" gracePeriod=2 Dec 03 18:09:58 crc kubenswrapper[4841]: E1203 18:09:58.410236 4841 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0ab7f9dd_4fcb_478c_96e4_890562d77a9f.slice/crio-e9377811da049f6061617d2dea16a4b4143be0aa46772ae5267f690dd1eb3f67.scope\": RecentStats: unable to find data in memory cache]" Dec 03 18:09:58 crc kubenswrapper[4841]: I1203 18:09:58.468512 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zck69_must-gather-vp85t_0ab7f9dd-4fcb-478c-96e4-890562d77a9f/copy/0.log" Dec 03 18:09:58 crc kubenswrapper[4841]: I1203 18:09:58.469215 4841 generic.go:334] "Generic (PLEG): container finished" podID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" containerID="e9377811da049f6061617d2dea16a4b4143be0aa46772ae5267f690dd1eb3f67" exitCode=143 Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.232750 4841 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-must-gather-zck69_must-gather-vp85t_0ab7f9dd-4fcb-478c-96e4-890562d77a9f/copy/0.log" Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.233144 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.258983 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gvrng\" (UniqueName: \"kubernetes.io/projected/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-kube-api-access-gvrng\") pod \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\" (UID: \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\") " Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.259141 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-must-gather-output\") pod \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\" (UID: \"0ab7f9dd-4fcb-478c-96e4-890562d77a9f\") " Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.267217 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-kube-api-access-gvrng" (OuterVolumeSpecName: "kube-api-access-gvrng") pod "0ab7f9dd-4fcb-478c-96e4-890562d77a9f" (UID: "0ab7f9dd-4fcb-478c-96e4-890562d77a9f"). InnerVolumeSpecName "kube-api-access-gvrng". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.363258 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gvrng\" (UniqueName: \"kubernetes.io/projected/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-kube-api-access-gvrng\") on node \"crc\" DevicePath \"\"" Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.403838 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0ab7f9dd-4fcb-478c-96e4-890562d77a9f" (UID: "0ab7f9dd-4fcb-478c-96e4-890562d77a9f"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.465743 4841 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0ab7f9dd-4fcb-478c-96e4-890562d77a9f-must-gather-output\") on node \"crc\" DevicePath \"\"" Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.484852 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-zck69_must-gather-vp85t_0ab7f9dd-4fcb-478c-96e4-890562d77a9f/copy/0.log" Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.485292 4841 scope.go:117] "RemoveContainer" containerID="e9377811da049f6061617d2dea16a4b4143be0aa46772ae5267f690dd1eb3f67" Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.485467 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-zck69/must-gather-vp85t" Dec 03 18:09:59 crc kubenswrapper[4841]: I1203 18:09:59.519189 4841 scope.go:117] "RemoveContainer" containerID="5fd1536043d4d8bf2c2f92cae9ef92e6247229de1023805a0ce02aa96c3e960f" Dec 03 18:10:00 crc kubenswrapper[4841]: I1203 18:10:00.251604 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" path="/var/lib/kubelet/pods/0ab7f9dd-4fcb-478c-96e4-890562d77a9f/volumes" Dec 03 18:10:09 crc kubenswrapper[4841]: I1203 18:10:09.316808 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:10:09 crc kubenswrapper[4841]: I1203 18:10:09.317484 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:10:09 crc kubenswrapper[4841]: I1203 18:10:09.317546 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 18:10:09 crc kubenswrapper[4841]: I1203 18:10:09.318368 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"745ef60cc95958cd5f6f7a2910bd798110a21beb88577482401df6db3aa0a98b"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 18:10:09 crc kubenswrapper[4841]: I1203 18:10:09.318461 4841 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://745ef60cc95958cd5f6f7a2910bd798110a21beb88577482401df6db3aa0a98b" gracePeriod=600 Dec 03 18:10:09 crc kubenswrapper[4841]: I1203 18:10:09.600380 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="745ef60cc95958cd5f6f7a2910bd798110a21beb88577482401df6db3aa0a98b" exitCode=0 Dec 03 18:10:09 crc kubenswrapper[4841]: I1203 18:10:09.600419 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"745ef60cc95958cd5f6f7a2910bd798110a21beb88577482401df6db3aa0a98b"} Dec 03 18:10:09 crc kubenswrapper[4841]: I1203 18:10:09.601085 4841 scope.go:117] "RemoveContainer" containerID="71f84f698412c549b479781d0dda710bb351c1b0c83fabf80480edce0618c195" Dec 03 18:10:10 crc kubenswrapper[4841]: I1203 18:10:10.615940 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3"} Dec 03 18:11:26 crc kubenswrapper[4841]: I1203 18:11:26.839585 4841 scope.go:117] "RemoveContainer" containerID="da93101ddee90b8ecd1d93f11ffa31b6f0db1d7c6a9844378a00dce07e401c16" Dec 03 18:12:09 crc kubenswrapper[4841]: I1203 18:12:09.317165 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:12:09 crc kubenswrapper[4841]: I1203 18:12:09.319508 
4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:12:39 crc kubenswrapper[4841]: I1203 18:12:39.316558 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:12:39 crc kubenswrapper[4841]: I1203 18:12:39.317293 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.661413 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dc867/must-gather-xcsb2"] Dec 03 18:12:48 crc kubenswrapper[4841]: E1203 18:12:48.662678 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" containerName="copy" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.662696 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" containerName="copy" Dec 03 18:12:48 crc kubenswrapper[4841]: E1203 18:12:48.662711 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" containerName="gather" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.662718 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" containerName="gather" 
Dec 03 18:12:48 crc kubenswrapper[4841]: E1203 18:12:48.662737 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b528b5-410f-4202-a69c-a8f2efec5562" containerName="extract-content" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.662745 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b528b5-410f-4202-a69c-a8f2efec5562" containerName="extract-content" Dec 03 18:12:48 crc kubenswrapper[4841]: E1203 18:12:48.662763 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerName="extract-utilities" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.662771 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerName="extract-utilities" Dec 03 18:12:48 crc kubenswrapper[4841]: E1203 18:12:48.662784 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b528b5-410f-4202-a69c-a8f2efec5562" containerName="registry-server" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.662791 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b528b5-410f-4202-a69c-a8f2efec5562" containerName="registry-server" Dec 03 18:12:48 crc kubenswrapper[4841]: E1203 18:12:48.662805 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60b528b5-410f-4202-a69c-a8f2efec5562" containerName="extract-utilities" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.662813 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="60b528b5-410f-4202-a69c-a8f2efec5562" containerName="extract-utilities" Dec 03 18:12:48 crc kubenswrapper[4841]: E1203 18:12:48.662823 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerName="registry-server" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.662830 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerName="registry-server" Dec 03 
18:12:48 crc kubenswrapper[4841]: E1203 18:12:48.662850 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerName="extract-content" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.662858 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerName="extract-content" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.663118 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" containerName="copy" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.663143 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="60b528b5-410f-4202-a69c-a8f2efec5562" containerName="registry-server" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.663159 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ab7f9dd-4fcb-478c-96e4-890562d77a9f" containerName="gather" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.663168 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="a623b8e4-f243-4eb5-b3da-697cfe9bd18f" containerName="registry-server" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.664513 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc867/must-gather-xcsb2" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.675547 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dc867"/"kube-root-ca.crt" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.675567 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dc867"/"openshift-service-ca.crt" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.691548 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dc867/must-gather-xcsb2"] Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.852128 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4fc2842-ab60-4853-a50f-3d238c0f5824-must-gather-output\") pod \"must-gather-xcsb2\" (UID: \"a4fc2842-ab60-4853-a50f-3d238c0f5824\") " pod="openshift-must-gather-dc867/must-gather-xcsb2" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.852372 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pdxd\" (UniqueName: \"kubernetes.io/projected/a4fc2842-ab60-4853-a50f-3d238c0f5824-kube-api-access-8pdxd\") pod \"must-gather-xcsb2\" (UID: \"a4fc2842-ab60-4853-a50f-3d238c0f5824\") " pod="openshift-must-gather-dc867/must-gather-xcsb2" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.955015 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4fc2842-ab60-4853-a50f-3d238c0f5824-must-gather-output\") pod \"must-gather-xcsb2\" (UID: \"a4fc2842-ab60-4853-a50f-3d238c0f5824\") " pod="openshift-must-gather-dc867/must-gather-xcsb2" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.955121 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8pdxd\" (UniqueName: \"kubernetes.io/projected/a4fc2842-ab60-4853-a50f-3d238c0f5824-kube-api-access-8pdxd\") pod \"must-gather-xcsb2\" (UID: \"a4fc2842-ab60-4853-a50f-3d238c0f5824\") " pod="openshift-must-gather-dc867/must-gather-xcsb2" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.955571 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4fc2842-ab60-4853-a50f-3d238c0f5824-must-gather-output\") pod \"must-gather-xcsb2\" (UID: \"a4fc2842-ab60-4853-a50f-3d238c0f5824\") " pod="openshift-must-gather-dc867/must-gather-xcsb2" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.972408 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pdxd\" (UniqueName: \"kubernetes.io/projected/a4fc2842-ab60-4853-a50f-3d238c0f5824-kube-api-access-8pdxd\") pod \"must-gather-xcsb2\" (UID: \"a4fc2842-ab60-4853-a50f-3d238c0f5824\") " pod="openshift-must-gather-dc867/must-gather-xcsb2" Dec 03 18:12:48 crc kubenswrapper[4841]: I1203 18:12:48.988324 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc867/must-gather-xcsb2" Dec 03 18:12:49 crc kubenswrapper[4841]: I1203 18:12:49.568004 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dc867/must-gather-xcsb2"] Dec 03 18:12:49 crc kubenswrapper[4841]: I1203 18:12:49.705572 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc867/must-gather-xcsb2" event={"ID":"a4fc2842-ab60-4853-a50f-3d238c0f5824","Type":"ContainerStarted","Data":"fc9f7aa3d7e68f911d407b7157b34660c5ab0513b694c1a82c7fb13c7cd0ccb5"} Dec 03 18:12:50 crc kubenswrapper[4841]: I1203 18:12:50.719130 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc867/must-gather-xcsb2" event={"ID":"a4fc2842-ab60-4853-a50f-3d238c0f5824","Type":"ContainerStarted","Data":"624b27619e2d2369691d92c6bfda4a83f7b29592547b2a98c0879139f2a367b4"} Dec 03 18:12:50 crc kubenswrapper[4841]: I1203 18:12:50.720721 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc867/must-gather-xcsb2" event={"ID":"a4fc2842-ab60-4853-a50f-3d238c0f5824","Type":"ContainerStarted","Data":"3ba9a3536cdcab9e445cd9fc83d5a2db021440cf4646e77ccd185283f6abf6bc"} Dec 03 18:12:50 crc kubenswrapper[4841]: I1203 18:12:50.765064 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dc867/must-gather-xcsb2" podStartSLOduration=2.765035514 podStartE2EDuration="2.765035514s" podCreationTimestamp="2025-12-03 18:12:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:12:50.747513362 +0000 UTC m=+4365.135034119" watchObservedRunningTime="2025-12-03 18:12:50.765035514 +0000 UTC m=+4365.152556281" Dec 03 18:12:55 crc kubenswrapper[4841]: I1203 18:12:55.412071 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dc867/crc-debug-cg52s"] Dec 03 18:12:55 crc kubenswrapper[4841]: 
I1203 18:12:55.413641 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:12:55 crc kubenswrapper[4841]: I1203 18:12:55.415873 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dc867"/"default-dockercfg-dc5sk" Dec 03 18:12:55 crc kubenswrapper[4841]: I1203 18:12:55.604560 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgfhj\" (UniqueName: \"kubernetes.io/projected/7b57ff31-9e25-4033-a3ca-f62ede702650-kube-api-access-dgfhj\") pod \"crc-debug-cg52s\" (UID: \"7b57ff31-9e25-4033-a3ca-f62ede702650\") " pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:12:55 crc kubenswrapper[4841]: I1203 18:12:55.604645 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b57ff31-9e25-4033-a3ca-f62ede702650-host\") pod \"crc-debug-cg52s\" (UID: \"7b57ff31-9e25-4033-a3ca-f62ede702650\") " pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:12:55 crc kubenswrapper[4841]: I1203 18:12:55.707575 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgfhj\" (UniqueName: \"kubernetes.io/projected/7b57ff31-9e25-4033-a3ca-f62ede702650-kube-api-access-dgfhj\") pod \"crc-debug-cg52s\" (UID: \"7b57ff31-9e25-4033-a3ca-f62ede702650\") " pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:12:55 crc kubenswrapper[4841]: I1203 18:12:55.707665 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b57ff31-9e25-4033-a3ca-f62ede702650-host\") pod \"crc-debug-cg52s\" (UID: \"7b57ff31-9e25-4033-a3ca-f62ede702650\") " pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:12:55 crc kubenswrapper[4841]: I1203 18:12:55.707789 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b57ff31-9e25-4033-a3ca-f62ede702650-host\") pod \"crc-debug-cg52s\" (UID: \"7b57ff31-9e25-4033-a3ca-f62ede702650\") " pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:12:55 crc kubenswrapper[4841]: I1203 18:12:55.731553 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgfhj\" (UniqueName: \"kubernetes.io/projected/7b57ff31-9e25-4033-a3ca-f62ede702650-kube-api-access-dgfhj\") pod \"crc-debug-cg52s\" (UID: \"7b57ff31-9e25-4033-a3ca-f62ede702650\") " pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:12:55 crc kubenswrapper[4841]: I1203 18:12:55.735647 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:12:55 crc kubenswrapper[4841]: W1203 18:12:55.770779 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b57ff31_9e25_4033_a3ca_f62ede702650.slice/crio-0e91e4c0555a2d0da7e1a62fef218361a97065130416ceecf98874138ffedc47 WatchSource:0}: Error finding container 0e91e4c0555a2d0da7e1a62fef218361a97065130416ceecf98874138ffedc47: Status 404 returned error can't find the container with id 0e91e4c0555a2d0da7e1a62fef218361a97065130416ceecf98874138ffedc47 Dec 03 18:12:56 crc kubenswrapper[4841]: I1203 18:12:56.774783 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc867/crc-debug-cg52s" event={"ID":"7b57ff31-9e25-4033-a3ca-f62ede702650","Type":"ContainerStarted","Data":"d0ea132a95cbf38315a53d7624204e542dc607f85acc05d9413e90eead5b8ca6"} Dec 03 18:12:56 crc kubenswrapper[4841]: I1203 18:12:56.776187 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc867/crc-debug-cg52s" event={"ID":"7b57ff31-9e25-4033-a3ca-f62ede702650","Type":"ContainerStarted","Data":"0e91e4c0555a2d0da7e1a62fef218361a97065130416ceecf98874138ffedc47"} Dec 
03 18:12:56 crc kubenswrapper[4841]: I1203 18:12:56.790722 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dc867/crc-debug-cg52s" podStartSLOduration=1.790704128 podStartE2EDuration="1.790704128s" podCreationTimestamp="2025-12-03 18:12:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 18:12:56.78875235 +0000 UTC m=+4371.176273077" watchObservedRunningTime="2025-12-03 18:12:56.790704128 +0000 UTC m=+4371.178224855" Dec 03 18:13:06 crc kubenswrapper[4841]: I1203 18:13:06.865419 4841 generic.go:334] "Generic (PLEG): container finished" podID="7b57ff31-9e25-4033-a3ca-f62ede702650" containerID="d0ea132a95cbf38315a53d7624204e542dc607f85acc05d9413e90eead5b8ca6" exitCode=0 Dec 03 18:13:06 crc kubenswrapper[4841]: I1203 18:13:06.865527 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc867/crc-debug-cg52s" event={"ID":"7b57ff31-9e25-4033-a3ca-f62ede702650","Type":"ContainerDied","Data":"d0ea132a95cbf38315a53d7624204e542dc607f85acc05d9413e90eead5b8ca6"} Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.014819 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.047953 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dc867/crc-debug-cg52s"] Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.053188 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b57ff31-9e25-4033-a3ca-f62ede702650-host\") pod \"7b57ff31-9e25-4033-a3ca-f62ede702650\" (UID: \"7b57ff31-9e25-4033-a3ca-f62ede702650\") " Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.053305 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b57ff31-9e25-4033-a3ca-f62ede702650-host" (OuterVolumeSpecName: "host") pod "7b57ff31-9e25-4033-a3ca-f62ede702650" (UID: "7b57ff31-9e25-4033-a3ca-f62ede702650"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.053330 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgfhj\" (UniqueName: \"kubernetes.io/projected/7b57ff31-9e25-4033-a3ca-f62ede702650-kube-api-access-dgfhj\") pod \"7b57ff31-9e25-4033-a3ca-f62ede702650\" (UID: \"7b57ff31-9e25-4033-a3ca-f62ede702650\") " Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.054511 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7b57ff31-9e25-4033-a3ca-f62ede702650-host\") on node \"crc\" DevicePath \"\"" Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.058764 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dc867/crc-debug-cg52s"] Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.060769 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b57ff31-9e25-4033-a3ca-f62ede702650-kube-api-access-dgfhj" 
(OuterVolumeSpecName: "kube-api-access-dgfhj") pod "7b57ff31-9e25-4033-a3ca-f62ede702650" (UID: "7b57ff31-9e25-4033-a3ca-f62ede702650"). InnerVolumeSpecName "kube-api-access-dgfhj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.156493 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgfhj\" (UniqueName: \"kubernetes.io/projected/7b57ff31-9e25-4033-a3ca-f62ede702650-kube-api-access-dgfhj\") on node \"crc\" DevicePath \"\"" Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.251317 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b57ff31-9e25-4033-a3ca-f62ede702650" path="/var/lib/kubelet/pods/7b57ff31-9e25-4033-a3ca-f62ede702650/volumes" Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.887766 4841 scope.go:117] "RemoveContainer" containerID="d0ea132a95cbf38315a53d7624204e542dc607f85acc05d9413e90eead5b8ca6" Dec 03 18:13:08 crc kubenswrapper[4841]: I1203 18:13:08.887821 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc867/crc-debug-cg52s" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.316352 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.316413 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.316460 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.317213 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.317263 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" gracePeriod=600 Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.340516 4841 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-dc867/crc-debug-jd8mz"] Dec 03 18:13:09 crc kubenswrapper[4841]: E1203 18:13:09.341190 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b57ff31-9e25-4033-a3ca-f62ede702650" containerName="container-00" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.341214 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b57ff31-9e25-4033-a3ca-f62ede702650" containerName="container-00" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.341588 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b57ff31-9e25-4033-a3ca-f62ede702650" containerName="container-00" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.342732 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.344919 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-dc867"/"default-dockercfg-dc5sk" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.377455 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slfd5\" (UniqueName: \"kubernetes.io/projected/3941279e-acff-4777-92a6-c324201f0e13-kube-api-access-slfd5\") pod \"crc-debug-jd8mz\" (UID: \"3941279e-acff-4777-92a6-c324201f0e13\") " pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.377584 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3941279e-acff-4777-92a6-c324201f0e13-host\") pod \"crc-debug-jd8mz\" (UID: \"3941279e-acff-4777-92a6-c324201f0e13\") " pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.479419 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slfd5\" (UniqueName: 
\"kubernetes.io/projected/3941279e-acff-4777-92a6-c324201f0e13-kube-api-access-slfd5\") pod \"crc-debug-jd8mz\" (UID: \"3941279e-acff-4777-92a6-c324201f0e13\") " pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.479515 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3941279e-acff-4777-92a6-c324201f0e13-host\") pod \"crc-debug-jd8mz\" (UID: \"3941279e-acff-4777-92a6-c324201f0e13\") " pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.479687 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3941279e-acff-4777-92a6-c324201f0e13-host\") pod \"crc-debug-jd8mz\" (UID: \"3941279e-acff-4777-92a6-c324201f0e13\") " pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.501481 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slfd5\" (UniqueName: \"kubernetes.io/projected/3941279e-acff-4777-92a6-c324201f0e13-kube-api-access-slfd5\") pod \"crc-debug-jd8mz\" (UID: \"3941279e-acff-4777-92a6-c324201f0e13\") " pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.665337 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:09 crc kubenswrapper[4841]: W1203 18:13:09.711738 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3941279e_acff_4777_92a6_c324201f0e13.slice/crio-38b5b78db453a90b542697409e81f941699c3deba6f076b1b462dded8416e217 WatchSource:0}: Error finding container 38b5b78db453a90b542697409e81f941699c3deba6f076b1b462dded8416e217: Status 404 returned error can't find the container with id 38b5b78db453a90b542697409e81f941699c3deba6f076b1b462dded8416e217 Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.895382 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc867/crc-debug-jd8mz" event={"ID":"3941279e-acff-4777-92a6-c324201f0e13","Type":"ContainerStarted","Data":"38b5b78db453a90b542697409e81f941699c3deba6f076b1b462dded8416e217"} Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.897323 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" exitCode=0 Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.897366 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3"} Dec 03 18:13:09 crc kubenswrapper[4841]: I1203 18:13:09.897390 4841 scope.go:117] "RemoveContainer" containerID="745ef60cc95958cd5f6f7a2910bd798110a21beb88577482401df6db3aa0a98b" Dec 03 18:13:09 crc kubenswrapper[4841]: E1203 18:13:09.965786 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:13:10 crc kubenswrapper[4841]: I1203 18:13:10.920712 4841 generic.go:334] "Generic (PLEG): container finished" podID="3941279e-acff-4777-92a6-c324201f0e13" containerID="01217711605fa400256a8798cef8f00b6a6254bda9d34baaf7d5424f42d287d8" exitCode=1 Dec 03 18:13:10 crc kubenswrapper[4841]: I1203 18:13:10.920956 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc867/crc-debug-jd8mz" event={"ID":"3941279e-acff-4777-92a6-c324201f0e13","Type":"ContainerDied","Data":"01217711605fa400256a8798cef8f00b6a6254bda9d34baaf7d5424f42d287d8"} Dec 03 18:13:10 crc kubenswrapper[4841]: I1203 18:13:10.924644 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:13:10 crc kubenswrapper[4841]: E1203 18:13:10.925304 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:13:10 crc kubenswrapper[4841]: I1203 18:13:10.969126 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dc867/crc-debug-jd8mz"] Dec 03 18:13:10 crc kubenswrapper[4841]: I1203 18:13:10.976969 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dc867/crc-debug-jd8mz"] Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.022611 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.123794 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slfd5\" (UniqueName: \"kubernetes.io/projected/3941279e-acff-4777-92a6-c324201f0e13-kube-api-access-slfd5\") pod \"3941279e-acff-4777-92a6-c324201f0e13\" (UID: \"3941279e-acff-4777-92a6-c324201f0e13\") " Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.123893 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3941279e-acff-4777-92a6-c324201f0e13-host\") pod \"3941279e-acff-4777-92a6-c324201f0e13\" (UID: \"3941279e-acff-4777-92a6-c324201f0e13\") " Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.124269 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3941279e-acff-4777-92a6-c324201f0e13-host" (OuterVolumeSpecName: "host") pod "3941279e-acff-4777-92a6-c324201f0e13" (UID: "3941279e-acff-4777-92a6-c324201f0e13"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.135257 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3941279e-acff-4777-92a6-c324201f0e13-kube-api-access-slfd5" (OuterVolumeSpecName: "kube-api-access-slfd5") pod "3941279e-acff-4777-92a6-c324201f0e13" (UID: "3941279e-acff-4777-92a6-c324201f0e13"). InnerVolumeSpecName "kube-api-access-slfd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.225939 4841 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3941279e-acff-4777-92a6-c324201f0e13-host\") on node \"crc\" DevicePath \"\"" Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.225981 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slfd5\" (UniqueName: \"kubernetes.io/projected/3941279e-acff-4777-92a6-c324201f0e13-kube-api-access-slfd5\") on node \"crc\" DevicePath \"\"" Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.253110 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3941279e-acff-4777-92a6-c324201f0e13" path="/var/lib/kubelet/pods/3941279e-acff-4777-92a6-c324201f0e13/volumes" Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.941213 4841 scope.go:117] "RemoveContainer" containerID="01217711605fa400256a8798cef8f00b6a6254bda9d34baaf7d5424f42d287d8" Dec 03 18:13:12 crc kubenswrapper[4841]: I1203 18:13:12.941242 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dc867/crc-debug-jd8mz" Dec 03 18:13:22 crc kubenswrapper[4841]: I1203 18:13:22.239427 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:13:22 crc kubenswrapper[4841]: E1203 18:13:22.240006 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:13:34 crc kubenswrapper[4841]: I1203 18:13:34.241195 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:13:34 crc kubenswrapper[4841]: E1203 18:13:34.241784 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:13:47 crc kubenswrapper[4841]: I1203 18:13:47.239258 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:13:47 crc kubenswrapper[4841]: E1203 18:13:47.240260 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:14:00 crc kubenswrapper[4841]: I1203 18:14:00.239422 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:14:00 crc kubenswrapper[4841]: E1203 18:14:00.240117 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:14:15 crc kubenswrapper[4841]: I1203 18:14:15.239789 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:14:15 crc kubenswrapper[4841]: E1203 18:14:15.241081 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.210521 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q5rk9"] Dec 03 18:14:24 crc kubenswrapper[4841]: E1203 18:14:24.211725 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3941279e-acff-4777-92a6-c324201f0e13" containerName="container-00" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.211747 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3941279e-acff-4777-92a6-c324201f0e13" 
containerName="container-00" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.212144 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3941279e-acff-4777-92a6-c324201f0e13" containerName="container-00" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.214433 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.224860 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q5rk9"] Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.323670 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-utilities\") pod \"community-operators-q5rk9\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.323723 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-catalog-content\") pod \"community-operators-q5rk9\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.323783 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jchtj\" (UniqueName: \"kubernetes.io/projected/543d093d-b3e8-4072-a2a4-b8e9414e80b7-kube-api-access-jchtj\") pod \"community-operators-q5rk9\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.426035 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-utilities\") pod \"community-operators-q5rk9\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.426092 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-catalog-content\") pod \"community-operators-q5rk9\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.426153 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jchtj\" (UniqueName: \"kubernetes.io/projected/543d093d-b3e8-4072-a2a4-b8e9414e80b7-kube-api-access-jchtj\") pod \"community-operators-q5rk9\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.427275 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-catalog-content\") pod \"community-operators-q5rk9\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.427582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-utilities\") pod \"community-operators-q5rk9\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.446362 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jchtj\" (UniqueName: 
\"kubernetes.io/projected/543d093d-b3e8-4072-a2a4-b8e9414e80b7-kube-api-access-jchtj\") pod \"community-operators-q5rk9\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:24 crc kubenswrapper[4841]: I1203 18:14:24.558615 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:25 crc kubenswrapper[4841]: I1203 18:14:25.163297 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q5rk9"] Dec 03 18:14:25 crc kubenswrapper[4841]: I1203 18:14:25.742040 4841 generic.go:334] "Generic (PLEG): container finished" podID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerID="89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a" exitCode=0 Dec 03 18:14:25 crc kubenswrapper[4841]: I1203 18:14:25.742196 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5rk9" event={"ID":"543d093d-b3e8-4072-a2a4-b8e9414e80b7","Type":"ContainerDied","Data":"89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a"} Dec 03 18:14:25 crc kubenswrapper[4841]: I1203 18:14:25.742288 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5rk9" event={"ID":"543d093d-b3e8-4072-a2a4-b8e9414e80b7","Type":"ContainerStarted","Data":"e642b461fe2417653ab7d2fa3135213f5bb52338aa99fc5868a70a09c4321f46"} Dec 03 18:14:25 crc kubenswrapper[4841]: I1203 18:14:25.744225 4841 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.584442 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vdvkb"] Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.586579 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.599994 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdvkb"] Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.671616 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-catalog-content\") pod \"redhat-marketplace-vdvkb\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.671677 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-utilities\") pod \"redhat-marketplace-vdvkb\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.671801 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6v4v\" (UniqueName: \"kubernetes.io/projected/0e01b771-80d1-4aa9-bf48-c137b3aceea8-kube-api-access-n6v4v\") pod \"redhat-marketplace-vdvkb\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.773662 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6v4v\" (UniqueName: \"kubernetes.io/projected/0e01b771-80d1-4aa9-bf48-c137b3aceea8-kube-api-access-n6v4v\") pod \"redhat-marketplace-vdvkb\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.774141 4841 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-catalog-content\") pod \"redhat-marketplace-vdvkb\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.774217 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-utilities\") pod \"redhat-marketplace-vdvkb\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.774609 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-catalog-content\") pod \"redhat-marketplace-vdvkb\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.774679 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-utilities\") pod \"redhat-marketplace-vdvkb\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.793358 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6v4v\" (UniqueName: \"kubernetes.io/projected/0e01b771-80d1-4aa9-bf48-c137b3aceea8-kube-api-access-n6v4v\") pod \"redhat-marketplace-vdvkb\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:26 crc kubenswrapper[4841]: I1203 18:14:26.911217 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:27 crc kubenswrapper[4841]: I1203 18:14:27.385641 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdvkb"] Dec 03 18:14:27 crc kubenswrapper[4841]: W1203 18:14:27.404814 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e01b771_80d1_4aa9_bf48_c137b3aceea8.slice/crio-1c058ddac7ea62a9df4b6518e51e9d1b867c20a47ee7be9b117e49fc1796de88 WatchSource:0}: Error finding container 1c058ddac7ea62a9df4b6518e51e9d1b867c20a47ee7be9b117e49fc1796de88: Status 404 returned error can't find the container with id 1c058ddac7ea62a9df4b6518e51e9d1b867c20a47ee7be9b117e49fc1796de88 Dec 03 18:14:27 crc kubenswrapper[4841]: I1203 18:14:27.762858 4841 generic.go:334] "Generic (PLEG): container finished" podID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerID="ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30" exitCode=0 Dec 03 18:14:27 crc kubenswrapper[4841]: I1203 18:14:27.763230 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdvkb" event={"ID":"0e01b771-80d1-4aa9-bf48-c137b3aceea8","Type":"ContainerDied","Data":"ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30"} Dec 03 18:14:27 crc kubenswrapper[4841]: I1203 18:14:27.763375 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdvkb" event={"ID":"0e01b771-80d1-4aa9-bf48-c137b3aceea8","Type":"ContainerStarted","Data":"1c058ddac7ea62a9df4b6518e51e9d1b867c20a47ee7be9b117e49fc1796de88"} Dec 03 18:14:27 crc kubenswrapper[4841]: I1203 18:14:27.775185 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5rk9" 
event={"ID":"543d093d-b3e8-4072-a2a4-b8e9414e80b7","Type":"ContainerStarted","Data":"b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4"} Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.289827 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_00f46b57-a05f-43fe-b97d-1f59137df281/init-config-reloader/0.log" Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.460879 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_00f46b57-a05f-43fe-b97d-1f59137df281/alertmanager/0.log" Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.480110 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_00f46b57-a05f-43fe-b97d-1f59137df281/init-config-reloader/0.log" Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.508114 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_00f46b57-a05f-43fe-b97d-1f59137df281/config-reloader/0.log" Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.679896 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6ece35e7-8cb6-4585-be80-cb4a526af861/aodh-api/0.log" Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.688884 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6ece35e7-8cb6-4585-be80-cb4a526af861/aodh-evaluator/0.log" Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.731206 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6ece35e7-8cb6-4585-be80-cb4a526af861/aodh-notifier/0.log" Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.736213 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_6ece35e7-8cb6-4585-be80-cb4a526af861/aodh-listener/0.log" Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.783753 4841 generic.go:334] "Generic (PLEG): container finished" 
podID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerID="b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4" exitCode=0 Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.783839 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5rk9" event={"ID":"543d093d-b3e8-4072-a2a4-b8e9414e80b7","Type":"ContainerDied","Data":"b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4"} Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.789264 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdvkb" event={"ID":"0e01b771-80d1-4aa9-bf48-c137b3aceea8","Type":"ContainerStarted","Data":"027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737"} Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.883609 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57c8cf4866-6qqks_a2adaec8-2204-42ce-bc82-2f7e45008cad/barbican-api/0.log" Dec 03 18:14:28 crc kubenswrapper[4841]: I1203 18:14:28.998050 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-57c8cf4866-6qqks_a2adaec8-2204-42ce-bc82-2f7e45008cad/barbican-api-log/0.log" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.094940 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-745f9599f8-67b5b_ec85a1b0-91a1-4d24-a64b-239c100a7861/barbican-keystone-listener/0.log" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.183939 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-745f9599f8-67b5b_ec85a1b0-91a1-4d24-a64b-239c100a7861/barbican-keystone-listener-log/0.log" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.307333 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-b7bb8bfcf-5cwg2_e2bde16a-a813-49ff-ac28-11cf8d1dfac4/barbican-worker/0.log" Dec 03 18:14:29 crc 
kubenswrapper[4841]: I1203 18:14:29.350983 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-b7bb8bfcf-5cwg2_e2bde16a-a813-49ff-ac28-11cf8d1dfac4/barbican-worker-log/0.log" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.493698 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-zxgrc_15e8ed9f-b5ae-44bb-b295-1222cdad5513/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.618678 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0610d33e-9635-49f9-a9db-f2bbac336470/ceilometer-central-agent/0.log" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.708531 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0610d33e-9635-49f9-a9db-f2bbac336470/proxy-httpd/0.log" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.751867 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0610d33e-9635-49f9-a9db-f2bbac336470/ceilometer-notification-agent/0.log" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.801277 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5rk9" event={"ID":"543d093d-b3e8-4072-a2a4-b8e9414e80b7","Type":"ContainerStarted","Data":"0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098"} Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.803762 4841 generic.go:334] "Generic (PLEG): container finished" podID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerID="027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737" exitCode=0 Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.803810 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdvkb" 
event={"ID":"0e01b771-80d1-4aa9-bf48-c137b3aceea8","Type":"ContainerDied","Data":"027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737"} Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.812147 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_0610d33e-9635-49f9-a9db-f2bbac336470/sg-core/0.log" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.822112 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q5rk9" podStartSLOduration=2.368062809 podStartE2EDuration="5.822088988s" podCreationTimestamp="2025-12-03 18:14:24 +0000 UTC" firstStartedPulling="2025-12-03 18:14:25.744009232 +0000 UTC m=+4460.131529959" lastFinishedPulling="2025-12-03 18:14:29.198035421 +0000 UTC m=+4463.585556138" observedRunningTime="2025-12-03 18:14:29.819157875 +0000 UTC m=+4464.206678602" watchObservedRunningTime="2025-12-03 18:14:29.822088988 +0000 UTC m=+4464.209609715" Dec 03 18:14:29 crc kubenswrapper[4841]: I1203 18:14:29.963398 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c12abe1e-e6a0-4bed-9bab-feb7bf43622d/cinder-api-log/0.log" Dec 03 18:14:30 crc kubenswrapper[4841]: I1203 18:14:30.058026 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c12abe1e-e6a0-4bed-9bab-feb7bf43622d/cinder-api/0.log" Dec 03 18:14:30 crc kubenswrapper[4841]: I1203 18:14:30.129922 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7b91ca91-fe2e-4f87-948f-f4db1f3a5854/cinder-scheduler/0.log" Dec 03 18:14:30 crc kubenswrapper[4841]: I1203 18:14:30.242399 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:14:30 crc kubenswrapper[4841]: E1203 18:14:30.242786 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:14:30 crc kubenswrapper[4841]: I1203 18:14:30.267364 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_7b91ca91-fe2e-4f87-948f-f4db1f3a5854/probe/0.log" Dec 03 18:14:30 crc kubenswrapper[4841]: I1203 18:14:30.449563 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-hbf6b_b09b36ac-de85-4fa1-ab95-1c24c0c33c0d/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:30 crc kubenswrapper[4841]: I1203 18:14:30.709305 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-cvdkt_b0be8d7d-6270-4255-8c6d-6a50f8c741a2/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:30 crc kubenswrapper[4841]: I1203 18:14:30.882723 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-r9xqr_42fb8094-c7e2-45f8-932f-e6b868d4cc38/init/0.log" Dec 03 18:14:31 crc kubenswrapper[4841]: I1203 18:14:31.113877 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-r9xqr_42fb8094-c7e2-45f8-932f-e6b868d4cc38/init/0.log" Dec 03 18:14:31 crc kubenswrapper[4841]: I1203 18:14:31.119456 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-sx49z_dfb9c7ef-19c9-4582-b13c-c399a2ef4e73/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:31 crc kubenswrapper[4841]: I1203 18:14:31.187448 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-6f6df4f56c-r9xqr_42fb8094-c7e2-45f8-932f-e6b868d4cc38/dnsmasq-dns/0.log" Dec 03 18:14:31 crc kubenswrapper[4841]: I1203 18:14:31.350566 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3c031e76-65c3-4f33-a588-3a76aa8a2c0b/glance-httpd/0.log" Dec 03 18:14:31 crc kubenswrapper[4841]: I1203 18:14:31.454156 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_3c031e76-65c3-4f33-a588-3a76aa8a2c0b/glance-log/0.log" Dec 03 18:14:31 crc kubenswrapper[4841]: I1203 18:14:31.553619 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a0e112d6-6648-48f5-872e-f4ac5e81de4e/glance-httpd/0.log" Dec 03 18:14:31 crc kubenswrapper[4841]: I1203 18:14:31.656057 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_a0e112d6-6648-48f5-872e-f4ac5e81de4e/glance-log/0.log" Dec 03 18:14:31 crc kubenswrapper[4841]: I1203 18:14:31.849013 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdvkb" event={"ID":"0e01b771-80d1-4aa9-bf48-c137b3aceea8","Type":"ContainerStarted","Data":"9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d"} Dec 03 18:14:31 crc kubenswrapper[4841]: I1203 18:14:31.867039 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vdvkb" podStartSLOduration=2.867933002 podStartE2EDuration="5.867024475s" podCreationTimestamp="2025-12-03 18:14:26 +0000 UTC" firstStartedPulling="2025-12-03 18:14:27.772824582 +0000 UTC m=+4462.160345309" lastFinishedPulling="2025-12-03 18:14:30.771916055 +0000 UTC m=+4465.159436782" observedRunningTime="2025-12-03 18:14:31.866233676 +0000 UTC m=+4466.253754403" watchObservedRunningTime="2025-12-03 18:14:31.867024475 +0000 UTC m=+4466.254545202" Dec 03 18:14:32 crc 
kubenswrapper[4841]: I1203 18:14:32.118782 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-555b568f78-v86bh_8d3ba7b0-6022-4dc8-adf1-4fed513b5a4d/heat-engine/0.log" Dec 03 18:14:32 crc kubenswrapper[4841]: I1203 18:14:32.240427 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-799f468d8f-qbwl4_839b1e72-e619-4c4c-80ac-0754251beb2a/heat-api/0.log" Dec 03 18:14:32 crc kubenswrapper[4841]: I1203 18:14:32.300123 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-c5f5d9bb6-ddbgn_bfc25bf9-7fd7-4f92-9dbb-61f291592975/heat-cfnapi/0.log" Dec 03 18:14:32 crc kubenswrapper[4841]: I1203 18:14:32.377287 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-8z9d7_1e098ac8-ac99-4b82-8723-7171dbb84329/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:32 crc kubenswrapper[4841]: I1203 18:14:32.403280 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-r5qds_ddd71965-3c25-46fe-a129-4e674bf7dcca/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:32 crc kubenswrapper[4841]: I1203 18:14:32.595845 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-59cdc4489f-kzmfh_471f221b-da02-49b2-901b-c8afd7aa38c5/keystone-api/0.log" Dec 03 18:14:32 crc kubenswrapper[4841]: I1203 18:14:32.639731 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29413081-tmxhg_a19d874c-b175-4268-87ae-bec2516b1e1a/keystone-cron/0.log" Dec 03 18:14:32 crc kubenswrapper[4841]: I1203 18:14:32.727367 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_18b7d958-8083-489d-ab83-9cc342dbad71/kube-state-metrics/0.log" Dec 03 18:14:32 crc kubenswrapper[4841]: I1203 18:14:32.799845 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-s5r8s_fbcfe5cb-55b6-4840-ad0c-a916165933d6/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:33 crc kubenswrapper[4841]: I1203 18:14:33.029205 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fbf7d7cfc-n8r2b_7e14311f-cc1f-454b-af0b-94a5cf3ed4e3/neutron-api/0.log" Dec 03 18:14:33 crc kubenswrapper[4841]: I1203 18:14:33.108027 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-6fbf7d7cfc-n8r2b_7e14311f-cc1f-454b-af0b-94a5cf3ed4e3/neutron-httpd/0.log" Dec 03 18:14:33 crc kubenswrapper[4841]: I1203 18:14:33.309039 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-92stt_390fe67f-4d0d-459c-9f27-d6cc843c2d55/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:33 crc kubenswrapper[4841]: I1203 18:14:33.634468 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4399b120-7a3b-430f-ad42-21a2c9bd0af5/nova-api-log/0.log" Dec 03 18:14:33 crc kubenswrapper[4841]: I1203 18:14:33.812802 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_fd4f28bc-9247-408c-a91e-94a0c739bfce/nova-cell0-conductor-conductor/0.log" Dec 03 18:14:33 crc kubenswrapper[4841]: I1203 18:14:33.974500 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4399b120-7a3b-430f-ad42-21a2c9bd0af5/nova-api-api/0.log" Dec 03 18:14:34 crc kubenswrapper[4841]: I1203 18:14:34.558762 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:34 crc kubenswrapper[4841]: I1203 18:14:34.558829 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:34 crc kubenswrapper[4841]: I1203 18:14:34.618281 4841 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:34 crc kubenswrapper[4841]: I1203 18:14:34.794751 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_de4fa66a-9d61-40a5-97f9-4c7841e1ca58/nova-cell1-conductor-conductor/0.log" Dec 03 18:14:34 crc kubenswrapper[4841]: I1203 18:14:34.826783 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_187b2155-68e5-419d-b438-e22374486ae8/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 18:14:34 crc kubenswrapper[4841]: I1203 18:14:34.935859 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-7s8tk_986f7983-1ff5-4510-a8e9-0e45c0fddd19/nova-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:34 crc kubenswrapper[4841]: I1203 18:14:34.937180 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:35 crc kubenswrapper[4841]: I1203 18:14:35.084677 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a3d1d17d-16e0-4160-93bc-3a926fedbfbd/nova-metadata-log/0.log" Dec 03 18:14:35 crc kubenswrapper[4841]: I1203 18:14:35.427441 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_382c890c-5616-49ab-afd8-59fa071147b4/mysql-bootstrap/0.log" Dec 03 18:14:35 crc kubenswrapper[4841]: I1203 18:14:35.441730 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_2a9a9acf-df22-425e-8f3a-4f9d5c5a17b0/nova-scheduler-scheduler/0.log" Dec 03 18:14:35 crc kubenswrapper[4841]: I1203 18:14:35.700215 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_382c890c-5616-49ab-afd8-59fa071147b4/galera/0.log" Dec 03 18:14:35 crc kubenswrapper[4841]: I1203 18:14:35.713082 4841 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_382c890c-5616-49ab-afd8-59fa071147b4/mysql-bootstrap/0.log" Dec 03 18:14:35 crc kubenswrapper[4841]: I1203 18:14:35.992487 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q5rk9"] Dec 03 18:14:36 crc kubenswrapper[4841]: I1203 18:14:36.546562 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c3685ed-a2fd-4f00-9452-70f9713117b3/mysql-bootstrap/0.log" Dec 03 18:14:36 crc kubenswrapper[4841]: I1203 18:14:36.603668 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_a3d1d17d-16e0-4160-93bc-3a926fedbfbd/nova-metadata-metadata/0.log" Dec 03 18:14:36 crc kubenswrapper[4841]: I1203 18:14:36.680279 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c3685ed-a2fd-4f00-9452-70f9713117b3/galera/0.log" Dec 03 18:14:36 crc kubenswrapper[4841]: I1203 18:14:36.736982 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_7c3685ed-a2fd-4f00-9452-70f9713117b3/mysql-bootstrap/0.log" Dec 03 18:14:36 crc kubenswrapper[4841]: I1203 18:14:36.863972 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_3514d2ee-ac27-49e0-a6c1-526c2c7c8bc7/openstackclient/0.log" Dec 03 18:14:36 crc kubenswrapper[4841]: I1203 18:14:36.911374 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:36 crc kubenswrapper[4841]: I1203 18:14:36.911817 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:36 crc kubenswrapper[4841]: I1203 18:14:36.952926 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-cqt22_41fcd3cb-c81b-4a15-bb57-aa38bfa47e41/ovn-controller/0.log" Dec 03 
18:14:36 crc kubenswrapper[4841]: I1203 18:14:36.963059 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:37 crc kubenswrapper[4841]: I1203 18:14:37.102727 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-6vxfx_805e03d0-c25b-4b59-8d0b-d526bc7fcc85/openstack-network-exporter/0.log" Dec 03 18:14:37 crc kubenswrapper[4841]: I1203 18:14:37.449674 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cj8qw_b3fab8b5-6122-451f-9b66-a1dbb0813c1b/ovsdb-server-init/0.log" Dec 03 18:14:37 crc kubenswrapper[4841]: I1203 18:14:37.778612 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cj8qw_b3fab8b5-6122-451f-9b66-a1dbb0813c1b/ovsdb-server-init/0.log" Dec 03 18:14:37 crc kubenswrapper[4841]: I1203 18:14:37.805371 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cj8qw_b3fab8b5-6122-451f-9b66-a1dbb0813c1b/ovs-vswitchd/0.log" Dec 03 18:14:37 crc kubenswrapper[4841]: I1203 18:14:37.820716 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-cj8qw_b3fab8b5-6122-451f-9b66-a1dbb0813c1b/ovsdb-server/0.log" Dec 03 18:14:37 crc kubenswrapper[4841]: I1203 18:14:37.910261 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q5rk9" podUID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerName="registry-server" containerID="cri-o://0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098" gracePeriod=2 Dec 03 18:14:37 crc kubenswrapper[4841]: I1203 18:14:37.965394 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.065415 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-northd-0_cbd132af-f941-486f-8791-402bae76197f/openstack-network-exporter/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.072437 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-js7nc_14e63bf1-717a-40b6-8c5d-e46bf40c68dc/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.085855 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_cbd132af-f941-486f-8791-402bae76197f/ovn-northd/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.333472 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a85311e5-2270-4d86-a617-1b7da0a346c8/ovsdbserver-nb/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.336435 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_a85311e5-2270-4d86-a617-1b7da0a346c8/openstack-network-exporter/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.383199 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdvkb"] Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.397729 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.571079 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f9b50c59-2571-4a25-bff5-bc84b18d7315/ovsdbserver-sb/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.576541 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_f9b50c59-2571-4a25-bff5-bc84b18d7315/openstack-network-exporter/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.581944 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jchtj\" (UniqueName: \"kubernetes.io/projected/543d093d-b3e8-4072-a2a4-b8e9414e80b7-kube-api-access-jchtj\") pod \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.582072 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-catalog-content\") pod \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.589359 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-utilities\") pod \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\" (UID: \"543d093d-b3e8-4072-a2a4-b8e9414e80b7\") " Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.591140 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-utilities" (OuterVolumeSpecName: "utilities") pod "543d093d-b3e8-4072-a2a4-b8e9414e80b7" (UID: "543d093d-b3e8-4072-a2a4-b8e9414e80b7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.591533 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.592006 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/543d093d-b3e8-4072-a2a4-b8e9414e80b7-kube-api-access-jchtj" (OuterVolumeSpecName: "kube-api-access-jchtj") pod "543d093d-b3e8-4072-a2a4-b8e9414e80b7" (UID: "543d093d-b3e8-4072-a2a4-b8e9414e80b7"). InnerVolumeSpecName "kube-api-access-jchtj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.633294 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "543d093d-b3e8-4072-a2a4-b8e9414e80b7" (UID: "543d093d-b3e8-4072-a2a4-b8e9414e80b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.693398 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jchtj\" (UniqueName: \"kubernetes.io/projected/543d093d-b3e8-4072-a2a4-b8e9414e80b7-kube-api-access-jchtj\") on node \"crc\" DevicePath \"\"" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.693601 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/543d093d-b3e8-4072-a2a4-b8e9414e80b7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.730801 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59984668d4-h88x4_753fad74-21af-48dd-ae45-1162eb580f22/placement-api/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.923328 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-59984668d4-h88x4_753fad74-21af-48dd-ae45-1162eb580f22/placement-log/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.927645 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/init-config-reloader/0.log" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.929461 4841 generic.go:334] "Generic (PLEG): container finished" podID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerID="0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098" exitCode=0 Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.929914 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5rk9" event={"ID":"543d093d-b3e8-4072-a2a4-b8e9414e80b7","Type":"ContainerDied","Data":"0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098"} Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.929974 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q5rk9" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.929995 4841 scope.go:117] "RemoveContainer" containerID="0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.929976 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q5rk9" event={"ID":"543d093d-b3e8-4072-a2a4-b8e9414e80b7","Type":"ContainerDied","Data":"e642b461fe2417653ab7d2fa3135213f5bb52338aa99fc5868a70a09c4321f46"} Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.959070 4841 scope.go:117] "RemoveContainer" containerID="b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4" Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.982177 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q5rk9"] Dec 03 18:14:38 crc kubenswrapper[4841]: I1203 18:14:38.993518 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q5rk9"] Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.028001 4841 scope.go:117] "RemoveContainer" containerID="89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.063039 4841 scope.go:117] "RemoveContainer" containerID="0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098" Dec 03 18:14:39 crc kubenswrapper[4841]: E1203 18:14:39.066744 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098\": container with ID starting with 0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098 not found: ID does not exist" containerID="0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.066816 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098"} err="failed to get container status \"0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098\": rpc error: code = NotFound desc = could not find container \"0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098\": container with ID starting with 0041468b8620ef82ca3c1312ebcfee247332448e40ae8c1191c24ea69c46b098 not found: ID does not exist" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.066843 4841 scope.go:117] "RemoveContainer" containerID="b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4" Dec 03 18:14:39 crc kubenswrapper[4841]: E1203 18:14:39.067319 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4\": container with ID starting with b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4 not found: ID does not exist" containerID="b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.067404 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4"} err="failed to get container status \"b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4\": rpc error: code = NotFound desc = could not find container \"b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4\": container with ID starting with b0c53f1cb85c6770d0fd57c3c8031ec5c5c513902abb696f23636fb82b2123b4 not found: ID does not exist" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.067485 4841 scope.go:117] "RemoveContainer" containerID="89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a" Dec 03 18:14:39 crc kubenswrapper[4841]: E1203 
18:14:39.067830 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a\": container with ID starting with 89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a not found: ID does not exist" containerID="89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.067958 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a"} err="failed to get container status \"89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a\": rpc error: code = NotFound desc = could not find container \"89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a\": container with ID starting with 89f374d56d5d8c150a26015a64a5972fbaffcd158218f2d17557eaefbdf3169a not found: ID does not exist" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.126376 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/init-config-reloader/0.log" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.127165 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/config-reloader/0.log" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.204320 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/thanos-sidecar/0.log" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.227379 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_d9f984e5-9578-43d5-be4a-4ea8f5634547/prometheus/0.log" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.332592 4841 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dbbe92f9-c159-49ce-90ab-dd67ff712b36/setup-container/0.log" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.538298 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dbbe92f9-c159-49ce-90ab-dd67ff712b36/setup-container/0.log" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.557454 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_dbbe92f9-c159-49ce-90ab-dd67ff712b36/rabbitmq/0.log" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.585965 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b4eda350-169d-4b70-be32-13d2a1ab1aa3/setup-container/0.log" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.930285 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b4eda350-169d-4b70-be32-13d2a1ab1aa3/setup-container/0.log" Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.939385 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vdvkb" podUID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerName="registry-server" containerID="cri-o://9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d" gracePeriod=2 Dec 03 18:14:39 crc kubenswrapper[4841]: I1203 18:14:39.950741 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_b4eda350-169d-4b70-be32-13d2a1ab1aa3/rabbitmq/0.log" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.023880 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-2w4vp_c815359c-145d-48c6-936f-98c8f4cf8fff/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.160445 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-q7w8g_b310e506-2bc4-400e-acd0-749838969d1c/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.278941 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" path="/var/lib/kubelet/pods/543d093d-b3e8-4072-a2a4-b8e9414e80b7/volumes" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.407379 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-ttrpx_c02f43bb-1c6c-45cc-bacc-3b8b0dd514a2/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.509599 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.631492 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6v4v\" (UniqueName: \"kubernetes.io/projected/0e01b771-80d1-4aa9-bf48-c137b3aceea8-kube-api-access-n6v4v\") pod \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.631868 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-catalog-content\") pod \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.632002 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-utilities\") pod \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\" (UID: \"0e01b771-80d1-4aa9-bf48-c137b3aceea8\") " Dec 03 18:14:40 crc 
kubenswrapper[4841]: I1203 18:14:40.633040 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-utilities" (OuterVolumeSpecName: "utilities") pod "0e01b771-80d1-4aa9-bf48-c137b3aceea8" (UID: "0e01b771-80d1-4aa9-bf48-c137b3aceea8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.694530 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e01b771-80d1-4aa9-bf48-c137b3aceea8-kube-api-access-n6v4v" (OuterVolumeSpecName: "kube-api-access-n6v4v") pod "0e01b771-80d1-4aa9-bf48-c137b3aceea8" (UID: "0e01b771-80d1-4aa9-bf48-c137b3aceea8"). InnerVolumeSpecName "kube-api-access-n6v4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.702692 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e01b771-80d1-4aa9-bf48-c137b3aceea8" (UID: "0e01b771-80d1-4aa9-bf48-c137b3aceea8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.733817 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.733866 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e01b771-80d1-4aa9-bf48-c137b3aceea8-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.733880 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6v4v\" (UniqueName: \"kubernetes.io/projected/0e01b771-80d1-4aa9-bf48-c137b3aceea8-kube-api-access-n6v4v\") on node \"crc\" DevicePath \"\"" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.739241 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rf2mz_4f7cc5b5-8153-47ba-9f43-5e188c86d8c0/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.956585 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-x8rlr_c05eea2a-71d0-483b-a0b9-92b28743b13e/ssh-known-hosts-edpm-deployment/0.log" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.958798 4841 generic.go:334] "Generic (PLEG): container finished" podID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerID="9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d" exitCode=0 Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.958964 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdvkb" event={"ID":"0e01b771-80d1-4aa9-bf48-c137b3aceea8","Type":"ContainerDied","Data":"9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d"} Dec 03 18:14:40 crc 
kubenswrapper[4841]: I1203 18:14:40.959080 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vdvkb" event={"ID":"0e01b771-80d1-4aa9-bf48-c137b3aceea8","Type":"ContainerDied","Data":"1c058ddac7ea62a9df4b6518e51e9d1b867c20a47ee7be9b117e49fc1796de88"} Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.959217 4841 scope.go:117] "RemoveContainer" containerID="9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.959034 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vdvkb" Dec 03 18:14:40 crc kubenswrapper[4841]: I1203 18:14:40.982941 4841 scope.go:117] "RemoveContainer" containerID="027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.016276 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdvkb"] Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.026092 4841 scope.go:117] "RemoveContainer" containerID="ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.043724 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vdvkb"] Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.115633 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6df7dcffd7-hvqxv_8efd46ec-9481-40f8-be85-637ddafc2291/proxy-server/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.122439 4841 scope.go:117] "RemoveContainer" containerID="9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d" Dec 03 18:14:41 crc kubenswrapper[4841]: E1203 18:14:41.124243 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d\": container with ID starting with 9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d not found: ID does not exist" containerID="9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.124299 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d"} err="failed to get container status \"9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d\": rpc error: code = NotFound desc = could not find container \"9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d\": container with ID starting with 9cd96ea5e992633f209beeee55d523932a160965853ce6b78a5ff958f73f8a3d not found: ID does not exist" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.124326 4841 scope.go:117] "RemoveContainer" containerID="027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737" Dec 03 18:14:41 crc kubenswrapper[4841]: E1203 18:14:41.132010 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737\": container with ID starting with 027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737 not found: ID does not exist" containerID="027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.132058 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737"} err="failed to get container status \"027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737\": rpc error: code = NotFound desc = could not find container \"027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737\": container with ID 
starting with 027e024aa2b895a1429178ae6181f158984a33a371c45fb670f84e612e8f4737 not found: ID does not exist" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.132086 4841 scope.go:117] "RemoveContainer" containerID="ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30" Dec 03 18:14:41 crc kubenswrapper[4841]: E1203 18:14:41.134403 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30\": container with ID starting with ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30 not found: ID does not exist" containerID="ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.134504 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30"} err="failed to get container status \"ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30\": rpc error: code = NotFound desc = could not find container \"ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30\": container with ID starting with ef791fc5f082562dd868a499d164bd1e989f43156a61aaac6a86961434cdec30 not found: ID does not exist" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.286451 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-6df7dcffd7-hvqxv_8efd46ec-9481-40f8-be85-637ddafc2291/proxy-httpd/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.328896 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-hzcwz_8be83cad-e31a-463f-9eca-837549c69fba/swift-ring-rebalance/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.458542 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/account-auditor/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.517773 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/account-reaper/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.558459 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/account-server/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.585017 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/account-replicator/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.695672 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/container-auditor/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.813208 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/container-server/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.831041 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/container-replicator/0.log" Dec 03 18:14:41 crc kubenswrapper[4841]: I1203 18:14:41.910935 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/container-updater/0.log" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.005389 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-auditor/0.log" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.056514 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-expirer/0.log" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.089559 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-replicator/0.log" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.153137 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-server/0.log" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.238991 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/object-updater/0.log" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.240990 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:14:42 crc kubenswrapper[4841]: E1203 18:14:42.241321 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.250138 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" path="/var/lib/kubelet/pods/0e01b771-80d1-4aa9-bf48-c137b3aceea8/volumes" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.276305 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/rsync/0.log" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.364347 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_73a827c8-6b3c-4ffa-9c76-3d3591f38182/swift-recon-cron/0.log" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.608545 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-82fk8_902cbae6-47ea-4334-8623-8148bb196870/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:42 crc kubenswrapper[4841]: I1203 18:14:42.663594 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-ldqxf_8880e946-3512-4dfc-9d56-c3210fd50e21/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Dec 03 18:14:50 crc kubenswrapper[4841]: I1203 18:14:50.356039 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_81cb6c96-a5d5-4120-8cb3-101344626b07/memcached/0.log" Dec 03 18:14:53 crc kubenswrapper[4841]: I1203 18:14:53.239395 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:14:53 crc kubenswrapper[4841]: E1203 18:14:53.240224 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.169951 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27"] Dec 03 18:15:00 crc kubenswrapper[4841]: E1203 18:15:00.170969 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerName="registry-server" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 
18:15:00.170986 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerName="registry-server" Dec 03 18:15:00 crc kubenswrapper[4841]: E1203 18:15:00.171004 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerName="extract-utilities" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.171013 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerName="extract-utilities" Dec 03 18:15:00 crc kubenswrapper[4841]: E1203 18:15:00.171040 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerName="extract-content" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.171048 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerName="extract-content" Dec 03 18:15:00 crc kubenswrapper[4841]: E1203 18:15:00.171071 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerName="extract-content" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.171081 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerName="extract-content" Dec 03 18:15:00 crc kubenswrapper[4841]: E1203 18:15:00.171096 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerName="extract-utilities" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.171105 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerName="extract-utilities" Dec 03 18:15:00 crc kubenswrapper[4841]: E1203 18:15:00.171134 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerName="registry-server" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 
18:15:00.171142 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerName="registry-server" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.171400 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e01b771-80d1-4aa9-bf48-c137b3aceea8" containerName="registry-server" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.171418 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="543d093d-b3e8-4072-a2a4-b8e9414e80b7" containerName="registry-server" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.172216 4841 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.178173 4841 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.180722 4841 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.188049 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27"] Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.292135 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-secret-volume\") pod \"collect-profiles-29413095-rpn27\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.292203 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ttm6\" (UniqueName: 
\"kubernetes.io/projected/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-kube-api-access-8ttm6\") pod \"collect-profiles-29413095-rpn27\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.292471 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-config-volume\") pod \"collect-profiles-29413095-rpn27\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.394474 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-secret-volume\") pod \"collect-profiles-29413095-rpn27\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.394580 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ttm6\" (UniqueName: \"kubernetes.io/projected/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-kube-api-access-8ttm6\") pod \"collect-profiles-29413095-rpn27\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.394698 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-config-volume\") pod \"collect-profiles-29413095-rpn27\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc 
kubenswrapper[4841]: I1203 18:15:00.395714 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-config-volume\") pod \"collect-profiles-29413095-rpn27\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.403774 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-secret-volume\") pod \"collect-profiles-29413095-rpn27\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.433245 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ttm6\" (UniqueName: \"kubernetes.io/projected/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-kube-api-access-8ttm6\") pod \"collect-profiles-29413095-rpn27\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:00 crc kubenswrapper[4841]: I1203 18:15:00.496399 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:01 crc kubenswrapper[4841]: I1203 18:15:01.000259 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27"] Dec 03 18:15:01 crc kubenswrapper[4841]: I1203 18:15:01.207417 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" event={"ID":"3e9e8efd-3809-45c8-99ef-6d1b97cc3865","Type":"ContainerStarted","Data":"6dc7653ef430b76e75cfe53b1d46782eebd9607660b2a8ceb7480bbd1c59fa56"} Dec 03 18:15:02 crc kubenswrapper[4841]: I1203 18:15:02.219119 4841 generic.go:334] "Generic (PLEG): container finished" podID="3e9e8efd-3809-45c8-99ef-6d1b97cc3865" containerID="af2923c0d1bdd205a303ba2305c613dc6b01d507ca53bb4a3288fbce6603a03c" exitCode=0 Dec 03 18:15:02 crc kubenswrapper[4841]: I1203 18:15:02.219282 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" event={"ID":"3e9e8efd-3809-45c8-99ef-6d1b97cc3865","Type":"ContainerDied","Data":"af2923c0d1bdd205a303ba2305c613dc6b01d507ca53bb4a3288fbce6603a03c"} Dec 03 18:15:03 crc kubenswrapper[4841]: I1203 18:15:03.889557 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.073625 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ttm6\" (UniqueName: \"kubernetes.io/projected/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-kube-api-access-8ttm6\") pod \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.073791 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-secret-volume\") pod \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.073848 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-config-volume\") pod \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\" (UID: \"3e9e8efd-3809-45c8-99ef-6d1b97cc3865\") " Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.074840 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-config-volume" (OuterVolumeSpecName: "config-volume") pod "3e9e8efd-3809-45c8-99ef-6d1b97cc3865" (UID: "3e9e8efd-3809-45c8-99ef-6d1b97cc3865"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.080135 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3e9e8efd-3809-45c8-99ef-6d1b97cc3865" (UID: "3e9e8efd-3809-45c8-99ef-6d1b97cc3865"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.080814 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-kube-api-access-8ttm6" (OuterVolumeSpecName: "kube-api-access-8ttm6") pod "3e9e8efd-3809-45c8-99ef-6d1b97cc3865" (UID: "3e9e8efd-3809-45c8-99ef-6d1b97cc3865"). InnerVolumeSpecName "kube-api-access-8ttm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.176671 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ttm6\" (UniqueName: \"kubernetes.io/projected/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-kube-api-access-8ttm6\") on node \"crc\" DevicePath \"\"" Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.176712 4841 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.176725 4841 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e9e8efd-3809-45c8-99ef-6d1b97cc3865-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.251629 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.258576 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29413095-rpn27" event={"ID":"3e9e8efd-3809-45c8-99ef-6d1b97cc3865","Type":"ContainerDied","Data":"6dc7653ef430b76e75cfe53b1d46782eebd9607660b2a8ceb7480bbd1c59fa56"} Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.258665 4841 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dc7653ef430b76e75cfe53b1d46782eebd9607660b2a8ceb7480bbd1c59fa56" Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.979745 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"] Dec 03 18:15:04 crc kubenswrapper[4841]: I1203 18:15:04.987785 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29413050-86wwr"] Dec 03 18:15:06 crc kubenswrapper[4841]: I1203 18:15:06.263667 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3699b8e8-84f3-4772-ad5b-b8b02a370fcc" path="/var/lib/kubelet/pods/3699b8e8-84f3-4772-ad5b-b8b02a370fcc/volumes" Dec 03 18:15:08 crc kubenswrapper[4841]: I1203 18:15:08.239511 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:15:08 crc kubenswrapper[4841]: E1203 18:15:08.240536 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:15:12 crc 
kubenswrapper[4841]: I1203 18:15:12.770330 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/util/0.log" Dec 03 18:15:12 crc kubenswrapper[4841]: I1203 18:15:12.994010 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/pull/0.log" Dec 03 18:15:12 crc kubenswrapper[4841]: I1203 18:15:12.999121 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/pull/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.015920 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/util/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.251166 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/pull/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.258019 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/util/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.286434 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_89c2d3247d66cf61ceff1d3a8952ef6bd0d91baa08e77f5024fc5e24eddzblf_dd47aeca-76ee-41e9-8707-43067a97d9ff/extract/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.464129 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-hwp8p_a6ef72b8-96de-4545-9100-081f42138dff/manager/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.474202 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qgzs7_6dbdda39-de04-49e2-8667-58eb77b076b9/kube-rbac-proxy/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.477610 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-hwp8p_a6ef72b8-96de-4545-9100-081f42138dff/kube-rbac-proxy/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.664631 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-bfvv8_d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8/kube-rbac-proxy/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.699238 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-bfvv8_d5ecd72c-3074-4870-bbe6-f6bfbbe5d5f8/manager/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.700673 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-qgzs7_6dbdda39-de04-49e2-8667-58eb77b076b9/manager/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.919108 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-jmzr6_c969bc4d-df07-4ec7-b406-7de0710faca8/kube-rbac-proxy/0.log" Dec 03 18:15:13 crc kubenswrapper[4841]: I1203 18:15:13.935536 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-jmzr6_c969bc4d-df07-4ec7-b406-7de0710faca8/manager/0.log" Dec 03 18:15:14 crc kubenswrapper[4841]: I1203 
18:15:14.085482 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-z8zb5_cae5c7a3-2395-4cfe-93f2-5a7301c52444/kube-rbac-proxy/0.log" Dec 03 18:15:14 crc kubenswrapper[4841]: I1203 18:15:14.160742 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8jcwp_096189c4-aa40-4a3d-b8df-f8dbfa674e08/kube-rbac-proxy/0.log" Dec 03 18:15:14 crc kubenswrapper[4841]: I1203 18:15:14.180254 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-z8zb5_cae5c7a3-2395-4cfe-93f2-5a7301c52444/manager/0.log" Dec 03 18:15:14 crc kubenswrapper[4841]: I1203 18:15:14.303050 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-8jcwp_096189c4-aa40-4a3d-b8df-f8dbfa674e08/manager/0.log" Dec 03 18:15:14 crc kubenswrapper[4841]: I1203 18:15:14.338274 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-s2b4r_b9bdf600-ace4-4f28-80c9-3dd36cf449ad/kube-rbac-proxy/0.log" Dec 03 18:15:14 crc kubenswrapper[4841]: I1203 18:15:14.531478 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-s2b4r_b9bdf600-ace4-4f28-80c9-3dd36cf449ad/manager/0.log" Dec 03 18:15:14 crc kubenswrapper[4841]: I1203 18:15:14.544268 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-jfddn_6384ded0-4512-4d89-bef4-004339bb019d/manager/0.log" Dec 03 18:15:14 crc kubenswrapper[4841]: I1203 18:15:14.558263 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-jfddn_6384ded0-4512-4d89-bef4-004339bb019d/kube-rbac-proxy/0.log" Dec 03 
18:15:14 crc kubenswrapper[4841]: I1203 18:15:14.703114 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fj8w7_0dda1581-f45b-42cd-840f-9b8f2d7a48b1/kube-rbac-proxy/0.log" Dec 03 18:15:14 crc kubenswrapper[4841]: I1203 18:15:14.790551 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-fj8w7_0dda1581-f45b-42cd-840f-9b8f2d7a48b1/manager/0.log" Dec 03 18:15:15 crc kubenswrapper[4841]: I1203 18:15:15.336078 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-khncz_38a95f1c-87ae-4464-b6fa-ad329d17290e/kube-rbac-proxy/0.log" Dec 03 18:15:15 crc kubenswrapper[4841]: I1203 18:15:15.350645 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-khncz_38a95f1c-87ae-4464-b6fa-ad329d17290e/manager/0.log" Dec 03 18:15:15 crc kubenswrapper[4841]: I1203 18:15:15.434670 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-t9pmr_abd88bfa-5c17-4486-a051-50c1ceaafe60/kube-rbac-proxy/0.log" Dec 03 18:15:15 crc kubenswrapper[4841]: I1203 18:15:15.532577 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-t9pmr_abd88bfa-5c17-4486-a051-50c1ceaafe60/manager/0.log" Dec 03 18:15:15 crc kubenswrapper[4841]: I1203 18:15:15.604224 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-627sl_36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb/kube-rbac-proxy/0.log" Dec 03 18:15:15 crc kubenswrapper[4841]: I1203 18:15:15.685031 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-627sl_36b8b756-d0a6-4db5-a33d-dcf5b9e77bbb/manager/0.log" Dec 03 18:15:16 crc kubenswrapper[4841]: I1203 18:15:16.050860 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-sl5jm_a253f1cc-d669-490e-9bf4-aff2e95347b0/kube-rbac-proxy/0.log" Dec 03 18:15:16 crc kubenswrapper[4841]: I1203 18:15:16.121840 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-sl5jm_a253f1cc-d669-490e-9bf4-aff2e95347b0/manager/0.log" Dec 03 18:15:16 crc kubenswrapper[4841]: I1203 18:15:16.218102 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l56t4_b24334e0-1dd6-4667-8ce1-6013cc71dd7f/kube-rbac-proxy/0.log" Dec 03 18:15:16 crc kubenswrapper[4841]: I1203 18:15:16.306570 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-l56t4_b24334e0-1dd6-4667-8ce1-6013cc71dd7f/manager/0.log" Dec 03 18:15:16 crc kubenswrapper[4841]: I1203 18:15:16.348541 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c_7e01626f-e7f3-4c48-bc9b-5d9261b3d89a/kube-rbac-proxy/0.log" Dec 03 18:15:16 crc kubenswrapper[4841]: I1203 18:15:16.416599 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4gdj4c_7e01626f-e7f3-4c48-bc9b-5d9261b3d89a/manager/0.log" Dec 03 18:15:17 crc kubenswrapper[4841]: I1203 18:15:17.276068 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-ldgf7_dca21fb1-4a3f-4003-a1e2-46c1b191b911/registry-server/0.log" Dec 03 18:15:17 crc kubenswrapper[4841]: I1203 18:15:17.506762 
4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6fdc9d4685-8lhfc_91effd10-805a-48e2-a65a-529fe1e33a37/operator/0.log" Dec 03 18:15:17 crc kubenswrapper[4841]: I1203 18:15:17.519858 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2d7g6_7b153ea5-5794-46c6-a3f3-099b3b45dfef/kube-rbac-proxy/0.log" Dec 03 18:15:17 crc kubenswrapper[4841]: I1203 18:15:17.675940 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-2d7g6_7b153ea5-5794-46c6-a3f3-099b3b45dfef/manager/0.log" Dec 03 18:15:17 crc kubenswrapper[4841]: I1203 18:15:17.811765 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-q9b79_a4eccc19-eb01-4b44-99e0-041144e4b409/kube-rbac-proxy/0.log" Dec 03 18:15:17 crc kubenswrapper[4841]: I1203 18:15:17.816490 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-q9b79_a4eccc19-eb01-4b44-99e0-041144e4b409/manager/0.log" Dec 03 18:15:17 crc kubenswrapper[4841]: I1203 18:15:17.946806 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-tv5cl_8a13d35a-d714-4a7f-922b-a6d3a0b580c3/operator/0.log" Dec 03 18:15:18 crc kubenswrapper[4841]: I1203 18:15:18.007412 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-km89l_70e46d25-a5c6-49b4-b3d5-0828bc234644/kube-rbac-proxy/0.log" Dec 03 18:15:18 crc kubenswrapper[4841]: I1203 18:15:18.186329 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-km89l_70e46d25-a5c6-49b4-b3d5-0828bc234644/manager/0.log" Dec 03 18:15:18 crc 
kubenswrapper[4841]: I1203 18:15:18.243525 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/kube-rbac-proxy/0.log" Dec 03 18:15:18 crc kubenswrapper[4841]: I1203 18:15:18.465231 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fnjsz_56b976ca-c419-42f4-b063-c0219f4e0a72/kube-rbac-proxy/0.log" Dec 03 18:15:18 crc kubenswrapper[4841]: I1203 18:15:18.510874 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-fnjsz_56b976ca-c419-42f4-b063-c0219f4e0a72/manager/0.log" Dec 03 18:15:18 crc kubenswrapper[4841]: I1203 18:15:18.520348 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-65c59f5d56-72jtb_1348de54-9137-400b-b3db-b684d9a03dc4/manager/0.log" Dec 03 18:15:18 crc kubenswrapper[4841]: I1203 18:15:18.744306 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c7f74d46b-4txld_3448d609-0836-4562-ac6b-03d353471880/manager/0.log" Dec 03 18:15:18 crc kubenswrapper[4841]: I1203 18:15:18.762368 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r9rbr_fe8b56bd-b492-48cf-a3f2-621b4f58d29c/manager/0.log" Dec 03 18:15:18 crc kubenswrapper[4841]: I1203 18:15:18.807777 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-r9rbr_fe8b56bd-b492-48cf-a3f2-621b4f58d29c/kube-rbac-proxy/0.log" Dec 03 18:15:22 crc kubenswrapper[4841]: I1203 18:15:22.238620 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:15:22 crc kubenswrapper[4841]: E1203 
18:15:22.239154 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:15:27 crc kubenswrapper[4841]: I1203 18:15:27.053270 4841 scope.go:117] "RemoveContainer" containerID="f282fe2a8e87d1ee6870a6da682bb8dafbef493020f5e90e24987a965596689a" Dec 03 18:15:35 crc kubenswrapper[4841]: I1203 18:15:35.239467 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:15:35 crc kubenswrapper[4841]: E1203 18:15:35.240369 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:15:40 crc kubenswrapper[4841]: I1203 18:15:40.749485 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-ng62w_4c07ec09-68a5-4c56-a97a-5eb0a73a020d/control-plane-machine-set-operator/0.log" Dec 03 18:15:40 crc kubenswrapper[4841]: I1203 18:15:40.916346 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-q4qff_9bf1c45d-ffb9-423e-bdea-7e2d209a47d1/kube-rbac-proxy/0.log" Dec 03 18:15:41 crc kubenswrapper[4841]: I1203 18:15:41.055720 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-q4qff_9bf1c45d-ffb9-423e-bdea-7e2d209a47d1/machine-api-operator/0.log" Dec 03 18:15:47 crc kubenswrapper[4841]: I1203 18:15:47.239743 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:15:47 crc kubenswrapper[4841]: E1203 18:15:47.240536 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:15:54 crc kubenswrapper[4841]: I1203 18:15:54.650193 4841 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-t9pmr" podUID="abd88bfa-5c17-4486-a051-50c1ceaafe60" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.77:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 18:15:58 crc kubenswrapper[4841]: I1203 18:15:58.897465 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-qhbrc_b6111350-39b6-4228-a2ac-3cc25ad33c50/cert-manager-controller/0.log" Dec 03 18:15:59 crc kubenswrapper[4841]: I1203 18:15:59.287762 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-2c5w9_6ea86ddf-89eb-471c-b9f5-1fef42cd94cd/cert-manager-webhook/0.log" Dec 03 18:15:59 crc kubenswrapper[4841]: I1203 18:15:59.323268 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-pmzsv_ae7ac6c5-8af0-40d3-9b0b-9009819f439d/cert-manager-cainjector/0.log" Dec 03 18:16:00 crc 
kubenswrapper[4841]: I1203 18:16:00.238855 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:16:00 crc kubenswrapper[4841]: E1203 18:16:00.239147 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:16:11 crc kubenswrapper[4841]: I1203 18:16:11.238608 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:16:11 crc kubenswrapper[4841]: E1203 18:16:11.240106 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:16:11 crc kubenswrapper[4841]: I1203 18:16:11.953821 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-d4ssd_5bdd62f2-102a-4f3a-80aa-e3600df311a9/nmstate-console-plugin/0.log" Dec 03 18:16:12 crc kubenswrapper[4841]: I1203 18:16:12.177550 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-fq2q6_61ca4cad-30b3-4672-ae6c-59fd14e78a4a/kube-rbac-proxy/0.log" Dec 03 18:16:12 crc kubenswrapper[4841]: I1203 18:16:12.200881 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-handler-mk7jv_7d600af9-9363-42fd-9b6c-dcf7181dc09b/nmstate-handler/0.log" Dec 03 18:16:12 crc kubenswrapper[4841]: I1203 18:16:12.308434 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-fq2q6_61ca4cad-30b3-4672-ae6c-59fd14e78a4a/nmstate-metrics/0.log" Dec 03 18:16:12 crc kubenswrapper[4841]: I1203 18:16:12.361411 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-qw72m_cb7a8135-8b1d-4ee5-9a6a-8eb26b739cc0/nmstate-operator/0.log" Dec 03 18:16:12 crc kubenswrapper[4841]: I1203 18:16:12.510686 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-85htl_0b352e6a-f766-4261-87a1-5e71b591df3b/nmstate-webhook/0.log" Dec 03 18:16:25 crc kubenswrapper[4841]: I1203 18:16:25.240441 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:16:25 crc kubenswrapper[4841]: E1203 18:16:25.241477 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:16:28 crc kubenswrapper[4841]: I1203 18:16:28.498102 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-w4nm5_c5228882-2889-44e6-8a36-db179d19fe25/kube-rbac-proxy/0.log" Dec 03 18:16:28 crc kubenswrapper[4841]: I1203 18:16:28.562980 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-w4nm5_c5228882-2889-44e6-8a36-db179d19fe25/controller/0.log" Dec 03 18:16:28 crc 
kubenswrapper[4841]: I1203 18:16:28.649359 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-frr-files/0.log" Dec 03 18:16:28 crc kubenswrapper[4841]: I1203 18:16:28.807649 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-reloader/0.log" Dec 03 18:16:28 crc kubenswrapper[4841]: I1203 18:16:28.831398 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-metrics/0.log" Dec 03 18:16:28 crc kubenswrapper[4841]: I1203 18:16:28.851719 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-frr-files/0.log" Dec 03 18:16:28 crc kubenswrapper[4841]: I1203 18:16:28.902966 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-reloader/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.072970 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-reloader/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.108418 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-metrics/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.118518 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-metrics/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.209308 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-frr-files/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.305529 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-metrics/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.313625 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-reloader/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.314191 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/cp-frr-files/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.367016 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/controller/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.484347 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/kube-rbac-proxy/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.501455 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/frr-metrics/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.593783 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/kube-rbac-proxy-frr/0.log" Dec 03 18:16:29 crc kubenswrapper[4841]: I1203 18:16:29.691369 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/reloader/0.log" Dec 03 18:16:30 crc kubenswrapper[4841]: I1203 18:16:30.171025 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-cncl8_cb37434f-6f72-4e5c-85f5-5e06f1e07692/frr-k8s-webhook-server/0.log" Dec 03 18:16:30 crc kubenswrapper[4841]: I1203 18:16:30.241288 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-67f9cc98fc-kcfzm_1fbe0c23-9239-43a3-981a-87b5d6f3af82/manager/0.log" Dec 03 18:16:30 crc kubenswrapper[4841]: I1203 18:16:30.422375 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-7cc87bb9cb-96fv2_57d5b20b-d392-41ab-8729-d877277201e0/webhook-server/0.log" Dec 03 18:16:30 crc kubenswrapper[4841]: I1203 18:16:30.644755 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vj7hl_aaf80919-384d-4751-9ca9-2b9f4994ef1b/kube-rbac-proxy/0.log" Dec 03 18:16:30 crc kubenswrapper[4841]: I1203 18:16:30.739634 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r82gk_da26606f-a8a1-42e0-b156-9bd538f20c60/frr/0.log" Dec 03 18:16:30 crc kubenswrapper[4841]: I1203 18:16:30.991262 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vj7hl_aaf80919-384d-4751-9ca9-2b9f4994ef1b/speaker/0.log" Dec 03 18:16:40 crc kubenswrapper[4841]: I1203 18:16:40.238888 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:16:40 crc kubenswrapper[4841]: E1203 18:16:40.240020 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.169157 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/util/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 
18:16:44.453816 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/pull/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.455594 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/util/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.494227 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/pull/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.626121 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/util/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.629436 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/pull/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.653056 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fc4knt_1f10118c-20c4-47f3-b078-673dd01ce685/extract/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.788977 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/util/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.946256 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/pull/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.965415 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/util/0.log" Dec 03 18:16:44 crc kubenswrapper[4841]: I1203 18:16:44.997500 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/pull/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.120526 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/extract/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.120959 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/pull/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.152594 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c9210kmmmb_9d2b7f0f-9da1-4722-9f7d-851a5a5c5149/util/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.327734 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/util/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.496309 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/util/0.log" Dec 03 
18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.510681 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/pull/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.542326 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/pull/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.704525 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/util/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.705206 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/extract/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.709882 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f83rjxrh_88a07844-4e33-407d-887f-abc37124f7e4/pull/0.log" Dec 03 18:16:45 crc kubenswrapper[4841]: I1203 18:16:45.836531 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-utilities/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.032626 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-content/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.045813 4841 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-utilities/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.058589 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-content/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.238105 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-content/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.253957 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/extract-utilities/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.462811 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-utilities/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.715133 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rlzqx"] Dec 03 18:16:46 crc kubenswrapper[4841]: E1203 18:16:46.715968 4841 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e9e8efd-3809-45c8-99ef-6d1b97cc3865" containerName="collect-profiles" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.716096 4841 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e9e8efd-3809-45c8-99ef-6d1b97cc3865" containerName="collect-profiles" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.716369 4841 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e9e8efd-3809-45c8-99ef-6d1b97cc3865" containerName="collect-profiles" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.717966 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.732782 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-content/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.736920 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-content/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.741359 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-2kk66_e06ffa98-bb06-47e5-ad3a-54d48e4886c8/registry-server/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.758875 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlzqx"] Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.763370 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-utilities/0.log" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.899604 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-catalog-content\") pod \"certified-operators-rlzqx\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.899647 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75h2t\" (UniqueName: \"kubernetes.io/projected/020ab32c-929d-40dd-9b2f-e5410d22945a-kube-api-access-75h2t\") pod \"certified-operators-rlzqx\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " 
pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.899727 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-utilities\") pod \"certified-operators-rlzqx\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:46 crc kubenswrapper[4841]: I1203 18:16:46.962702 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-content/0.log" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.001596 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-catalog-content\") pod \"certified-operators-rlzqx\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.001659 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75h2t\" (UniqueName: \"kubernetes.io/projected/020ab32c-929d-40dd-9b2f-e5410d22945a-kube-api-access-75h2t\") pod \"certified-operators-rlzqx\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.001795 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-utilities\") pod \"certified-operators-rlzqx\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.002467 4841 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-utilities\") pod \"certified-operators-rlzqx\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.002582 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-catalog-content\") pod \"certified-operators-rlzqx\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.011110 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/extract-utilities/0.log" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.021540 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75h2t\" (UniqueName: \"kubernetes.io/projected/020ab32c-929d-40dd-9b2f-e5410d22945a-kube-api-access-75h2t\") pod \"certified-operators-rlzqx\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.046295 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.320543 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qhx4z_b200dd17-70ee-42af-a890-b7f748be7b01/marketplace-operator/0.log" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.385316 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rlzqx"] Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.675409 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-utilities/0.log" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.724239 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-rwffh_190f7b14-18cf-4fb0-bfdf-fa21a8ded991/registry-server/0.log" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.794710 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-content/0.log" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.808437 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-utilities/0.log" Dec 03 18:16:47 crc kubenswrapper[4841]: I1203 18:16:47.861144 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-content/0.log" Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.025582 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-utilities/0.log" Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.070271 4841 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/extract-content/0.log" Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.202816 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-4wkq7_c7f4062c-74e4-424e-9508-2b16e8788201/registry-server/0.log" Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.224796 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-utilities/0.log" Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.341496 4841 generic.go:334] "Generic (PLEG): container finished" podID="020ab32c-929d-40dd-9b2f-e5410d22945a" containerID="c3d414ec58873ebb4b4359cfd4d7a81f3298305099461c9843b58f6ea17979a8" exitCode=0 Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.341540 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzqx" event={"ID":"020ab32c-929d-40dd-9b2f-e5410d22945a","Type":"ContainerDied","Data":"c3d414ec58873ebb4b4359cfd4d7a81f3298305099461c9843b58f6ea17979a8"} Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.341565 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzqx" event={"ID":"020ab32c-929d-40dd-9b2f-e5410d22945a","Type":"ContainerStarted","Data":"67fc38bf90fa6344ea8b243aae23dfef5d0f73790e612f3aa76e157347b8637f"} Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.372524 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-content/0.log" Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.385242 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-utilities/0.log" Dec 03 
18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.403859 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-content/0.log" Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.543022 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-utilities/0.log" Dec 03 18:16:48 crc kubenswrapper[4841]: I1203 18:16:48.575700 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/extract-content/0.log" Dec 03 18:16:49 crc kubenswrapper[4841]: I1203 18:16:49.300992 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-f8fk5_11cfe39b-967a-4099-bd85-414e09e2fc18/registry-server/0.log" Dec 03 18:16:49 crc kubenswrapper[4841]: I1203 18:16:49.352094 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzqx" event={"ID":"020ab32c-929d-40dd-9b2f-e5410d22945a","Type":"ContainerStarted","Data":"7395ff6f9a019e7546225be370dfdf885c43d10b0224c37f8f53646070624543"} Dec 03 18:16:50 crc kubenswrapper[4841]: I1203 18:16:50.362869 4841 generic.go:334] "Generic (PLEG): container finished" podID="020ab32c-929d-40dd-9b2f-e5410d22945a" containerID="7395ff6f9a019e7546225be370dfdf885c43d10b0224c37f8f53646070624543" exitCode=0 Dec 03 18:16:50 crc kubenswrapper[4841]: I1203 18:16:50.363229 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzqx" event={"ID":"020ab32c-929d-40dd-9b2f-e5410d22945a","Type":"ContainerDied","Data":"7395ff6f9a019e7546225be370dfdf885c43d10b0224c37f8f53646070624543"} Dec 03 18:16:51 crc kubenswrapper[4841]: I1203 18:16:51.239148 4841 scope.go:117] "RemoveContainer" 
containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:16:51 crc kubenswrapper[4841]: E1203 18:16:51.239745 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:16:51 crc kubenswrapper[4841]: I1203 18:16:51.373735 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzqx" event={"ID":"020ab32c-929d-40dd-9b2f-e5410d22945a","Type":"ContainerStarted","Data":"923b20c519cb6276e0b9ba8293eb35a3117cd819111c1479eb3267b15db99403"} Dec 03 18:16:51 crc kubenswrapper[4841]: I1203 18:16:51.404669 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rlzqx" podStartSLOduration=2.931360916 podStartE2EDuration="5.404647705s" podCreationTimestamp="2025-12-03 18:16:46 +0000 UTC" firstStartedPulling="2025-12-03 18:16:48.344303382 +0000 UTC m=+4602.731824119" lastFinishedPulling="2025-12-03 18:16:50.817590171 +0000 UTC m=+4605.205110908" observedRunningTime="2025-12-03 18:16:51.403328552 +0000 UTC m=+4605.790849279" watchObservedRunningTime="2025-12-03 18:16:51.404647705 +0000 UTC m=+4605.792168452" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.182838 4841 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2kkdw"] Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.186022 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.195251 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2kkdw"] Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.301356 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-catalog-content\") pod \"redhat-operators-2kkdw\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.301763 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-utilities\") pod \"redhat-operators-2kkdw\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.301966 4841 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d76v9\" (UniqueName: \"kubernetes.io/projected/e1bafac9-12dc-4884-8590-cf2e27ca7e13-kube-api-access-d76v9\") pod \"redhat-operators-2kkdw\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.403566 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-catalog-content\") pod \"redhat-operators-2kkdw\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.403705 4841 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-utilities\") pod \"redhat-operators-2kkdw\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.403785 4841 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d76v9\" (UniqueName: \"kubernetes.io/projected/e1bafac9-12dc-4884-8590-cf2e27ca7e13-kube-api-access-d76v9\") pod \"redhat-operators-2kkdw\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.404244 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-catalog-content\") pod \"redhat-operators-2kkdw\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.404311 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-utilities\") pod \"redhat-operators-2kkdw\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.436294 4841 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d76v9\" (UniqueName: \"kubernetes.io/projected/e1bafac9-12dc-4884-8590-cf2e27ca7e13-kube-api-access-d76v9\") pod \"redhat-operators-2kkdw\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.524494 4841 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:16:54 crc kubenswrapper[4841]: I1203 18:16:54.995495 4841 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2kkdw"] Dec 03 18:16:55 crc kubenswrapper[4841]: W1203 18:16:55.000188 4841 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1bafac9_12dc_4884_8590_cf2e27ca7e13.slice/crio-b548d3e834f1f72a21defef17188109cb23e3bb1fca19a407a6cb802cf2239f4 WatchSource:0}: Error finding container b548d3e834f1f72a21defef17188109cb23e3bb1fca19a407a6cb802cf2239f4: Status 404 returned error can't find the container with id b548d3e834f1f72a21defef17188109cb23e3bb1fca19a407a6cb802cf2239f4 Dec 03 18:16:55 crc kubenswrapper[4841]: I1203 18:16:55.454361 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kkdw" event={"ID":"e1bafac9-12dc-4884-8590-cf2e27ca7e13","Type":"ContainerStarted","Data":"b548d3e834f1f72a21defef17188109cb23e3bb1fca19a407a6cb802cf2239f4"} Dec 03 18:16:56 crc kubenswrapper[4841]: I1203 18:16:56.464960 4841 generic.go:334] "Generic (PLEG): container finished" podID="e1bafac9-12dc-4884-8590-cf2e27ca7e13" containerID="4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880" exitCode=0 Dec 03 18:16:56 crc kubenswrapper[4841]: I1203 18:16:56.465037 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kkdw" event={"ID":"e1bafac9-12dc-4884-8590-cf2e27ca7e13","Type":"ContainerDied","Data":"4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880"} Dec 03 18:16:57 crc kubenswrapper[4841]: I1203 18:16:57.047048 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:57 crc kubenswrapper[4841]: I1203 18:16:57.048451 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:57 crc kubenswrapper[4841]: I1203 18:16:57.128209 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:57 crc kubenswrapper[4841]: I1203 18:16:57.478398 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kkdw" event={"ID":"e1bafac9-12dc-4884-8590-cf2e27ca7e13","Type":"ContainerStarted","Data":"bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb"} Dec 03 18:16:57 crc kubenswrapper[4841]: I1203 18:16:57.541601 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:16:59 crc kubenswrapper[4841]: I1203 18:16:59.454863 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlzqx"] Dec 03 18:17:00 crc kubenswrapper[4841]: I1203 18:17:00.508493 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rlzqx" podUID="020ab32c-929d-40dd-9b2f-e5410d22945a" containerName="registry-server" containerID="cri-o://923b20c519cb6276e0b9ba8293eb35a3117cd819111c1479eb3267b15db99403" gracePeriod=2 Dec 03 18:17:02 crc kubenswrapper[4841]: I1203 18:17:02.531981 4841 generic.go:334] "Generic (PLEG): container finished" podID="e1bafac9-12dc-4884-8590-cf2e27ca7e13" containerID="bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb" exitCode=0 Dec 03 18:17:02 crc kubenswrapper[4841]: I1203 18:17:02.532063 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kkdw" event={"ID":"e1bafac9-12dc-4884-8590-cf2e27ca7e13","Type":"ContainerDied","Data":"bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb"} Dec 03 18:17:03 crc kubenswrapper[4841]: I1203 18:17:03.239778 4841 scope.go:117] "RemoveContainer" 
containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:17:03 crc kubenswrapper[4841]: E1203 18:17:03.240195 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:17:03 crc kubenswrapper[4841]: I1203 18:17:03.379595 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-f7jbq_ab0ef110-9ded-4408-9f52-0f8bbffd4f25/prometheus-operator/0.log" Dec 03 18:17:03 crc kubenswrapper[4841]: I1203 18:17:03.622142 4841 generic.go:334] "Generic (PLEG): container finished" podID="020ab32c-929d-40dd-9b2f-e5410d22945a" containerID="923b20c519cb6276e0b9ba8293eb35a3117cd819111c1479eb3267b15db99403" exitCode=0 Dec 03 18:17:03 crc kubenswrapper[4841]: I1203 18:17:03.622224 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzqx" event={"ID":"020ab32c-929d-40dd-9b2f-e5410d22945a","Type":"ContainerDied","Data":"923b20c519cb6276e0b9ba8293eb35a3117cd819111c1479eb3267b15db99403"} Dec 03 18:17:03 crc kubenswrapper[4841]: I1203 18:17:03.640748 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kkdw" event={"ID":"e1bafac9-12dc-4884-8590-cf2e27ca7e13","Type":"ContainerStarted","Data":"6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7"} Dec 03 18:17:03 crc kubenswrapper[4841]: I1203 18:17:03.665065 4841 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2kkdw" podStartSLOduration=2.96231722 podStartE2EDuration="9.665047506s" 
podCreationTimestamp="2025-12-03 18:16:54 +0000 UTC" firstStartedPulling="2025-12-03 18:16:56.467001508 +0000 UTC m=+4610.854522255" lastFinishedPulling="2025-12-03 18:17:03.169731814 +0000 UTC m=+4617.557252541" observedRunningTime="2025-12-03 18:17:03.658219378 +0000 UTC m=+4618.045740105" watchObservedRunningTime="2025-12-03 18:17:03.665047506 +0000 UTC m=+4618.052568233" Dec 03 18:17:03 crc kubenswrapper[4841]: I1203 18:17:03.774083 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-fc7b4b585-94dp4_3f6178c0-01f4-437f-b7bd-bcae5afcec18/prometheus-operator-admission-webhook/0.log" Dec 03 18:17:03 crc kubenswrapper[4841]: I1203 18:17:03.812656 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:17:03 crc kubenswrapper[4841]: I1203 18:17:03.872702 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-fc7b4b585-95r5v_853a7cd4-09bc-40c0-8b4c-3c91fb152dbe/prometheus-operator-admission-webhook/0.log" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.013851 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75h2t\" (UniqueName: \"kubernetes.io/projected/020ab32c-929d-40dd-9b2f-e5410d22945a-kube-api-access-75h2t\") pod \"020ab32c-929d-40dd-9b2f-e5410d22945a\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.013938 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-utilities\") pod \"020ab32c-929d-40dd-9b2f-e5410d22945a\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.014015 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-catalog-content\") pod \"020ab32c-929d-40dd-9b2f-e5410d22945a\" (UID: \"020ab32c-929d-40dd-9b2f-e5410d22945a\") " Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.014601 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-utilities" (OuterVolumeSpecName: "utilities") pod "020ab32c-929d-40dd-9b2f-e5410d22945a" (UID: "020ab32c-929d-40dd-9b2f-e5410d22945a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.014843 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.020171 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020ab32c-929d-40dd-9b2f-e5410d22945a-kube-api-access-75h2t" (OuterVolumeSpecName: "kube-api-access-75h2t") pod "020ab32c-929d-40dd-9b2f-e5410d22945a" (UID: "020ab32c-929d-40dd-9b2f-e5410d22945a"). InnerVolumeSpecName "kube-api-access-75h2t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.046818 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-rkbb8_123e62f6-3c8c-45f1-993c-12b1be324d9d/operator/0.log" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.049481 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "020ab32c-929d-40dd-9b2f-e5410d22945a" (UID: "020ab32c-929d-40dd-9b2f-e5410d22945a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.108975 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-v2t4d_f45374d5-3bf6-468b-9d32-be79178468a8/perses-operator/0.log" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.117365 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75h2t\" (UniqueName: \"kubernetes.io/projected/020ab32c-929d-40dd-9b2f-e5410d22945a-kube-api-access-75h2t\") on node \"crc\" DevicePath \"\"" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.117618 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/020ab32c-929d-40dd-9b2f-e5410d22945a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.525360 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.525400 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.655461 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rlzqx" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.657540 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rlzqx" event={"ID":"020ab32c-929d-40dd-9b2f-e5410d22945a","Type":"ContainerDied","Data":"67fc38bf90fa6344ea8b243aae23dfef5d0f73790e612f3aa76e157347b8637f"} Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.657772 4841 scope.go:117] "RemoveContainer" containerID="923b20c519cb6276e0b9ba8293eb35a3117cd819111c1479eb3267b15db99403" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.692215 4841 scope.go:117] "RemoveContainer" containerID="7395ff6f9a019e7546225be370dfdf885c43d10b0224c37f8f53646070624543" Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.696807 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rlzqx"] Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.718053 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rlzqx"] Dec 03 18:17:04 crc kubenswrapper[4841]: I1203 18:17:04.774554 4841 scope.go:117] "RemoveContainer" containerID="c3d414ec58873ebb4b4359cfd4d7a81f3298305099461c9843b58f6ea17979a8" Dec 03 18:17:05 crc kubenswrapper[4841]: I1203 18:17:05.599463 4841 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2kkdw" podUID="e1bafac9-12dc-4884-8590-cf2e27ca7e13" containerName="registry-server" probeResult="failure" output=< Dec 03 18:17:05 crc kubenswrapper[4841]: timeout: failed to connect service ":50051" within 1s Dec 03 18:17:05 crc kubenswrapper[4841]: > Dec 03 18:17:06 crc kubenswrapper[4841]: I1203 18:17:06.253607 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020ab32c-929d-40dd-9b2f-e5410d22945a" path="/var/lib/kubelet/pods/020ab32c-929d-40dd-9b2f-e5410d22945a/volumes" Dec 03 18:17:14 crc kubenswrapper[4841]: I1203 
18:17:14.588727 4841 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:17:14 crc kubenswrapper[4841]: I1203 18:17:14.653353 4841 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:17:14 crc kubenswrapper[4841]: I1203 18:17:14.826491 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2kkdw"] Dec 03 18:17:15 crc kubenswrapper[4841]: I1203 18:17:15.240824 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:17:15 crc kubenswrapper[4841]: E1203 18:17:15.241206 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:17:15 crc kubenswrapper[4841]: I1203 18:17:15.780935 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2kkdw" podUID="e1bafac9-12dc-4884-8590-cf2e27ca7e13" containerName="registry-server" containerID="cri-o://6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7" gracePeriod=2 Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.410419 4841 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.426660 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-catalog-content\") pod \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.426796 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-utilities\") pod \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.426939 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d76v9\" (UniqueName: \"kubernetes.io/projected/e1bafac9-12dc-4884-8590-cf2e27ca7e13-kube-api-access-d76v9\") pod \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\" (UID: \"e1bafac9-12dc-4884-8590-cf2e27ca7e13\") " Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.429248 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-utilities" (OuterVolumeSpecName: "utilities") pod "e1bafac9-12dc-4884-8590-cf2e27ca7e13" (UID: "e1bafac9-12dc-4884-8590-cf2e27ca7e13"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.452479 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1bafac9-12dc-4884-8590-cf2e27ca7e13-kube-api-access-d76v9" (OuterVolumeSpecName: "kube-api-access-d76v9") pod "e1bafac9-12dc-4884-8590-cf2e27ca7e13" (UID: "e1bafac9-12dc-4884-8590-cf2e27ca7e13"). InnerVolumeSpecName "kube-api-access-d76v9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.528614 4841 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.528955 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d76v9\" (UniqueName: \"kubernetes.io/projected/e1bafac9-12dc-4884-8590-cf2e27ca7e13-kube-api-access-d76v9\") on node \"crc\" DevicePath \"\"" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.566377 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e1bafac9-12dc-4884-8590-cf2e27ca7e13" (UID: "e1bafac9-12dc-4884-8590-cf2e27ca7e13"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.631088 4841 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e1bafac9-12dc-4884-8590-cf2e27ca7e13-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.791479 4841 generic.go:334] "Generic (PLEG): container finished" podID="e1bafac9-12dc-4884-8590-cf2e27ca7e13" containerID="6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7" exitCode=0 Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.791523 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2kkdw" event={"ID":"e1bafac9-12dc-4884-8590-cf2e27ca7e13","Type":"ContainerDied","Data":"6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7"} Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.791550 4841 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-2kkdw" event={"ID":"e1bafac9-12dc-4884-8590-cf2e27ca7e13","Type":"ContainerDied","Data":"b548d3e834f1f72a21defef17188109cb23e3bb1fca19a407a6cb802cf2239f4"} Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.791545 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2kkdw" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.791587 4841 scope.go:117] "RemoveContainer" containerID="6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.816729 4841 scope.go:117] "RemoveContainer" containerID="bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.842002 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2kkdw"] Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.850434 4841 scope.go:117] "RemoveContainer" containerID="4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.870001 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2kkdw"] Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.904900 4841 scope.go:117] "RemoveContainer" containerID="6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7" Dec 03 18:17:16 crc kubenswrapper[4841]: E1203 18:17:16.908015 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7\": container with ID starting with 6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7 not found: ID does not exist" containerID="6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.908073 4841 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7"} err="failed to get container status \"6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7\": rpc error: code = NotFound desc = could not find container \"6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7\": container with ID starting with 6df81add29aa5029a940fb864b9265cb6dcca7de2d0a5d9a0824678841b465e7 not found: ID does not exist" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.908104 4841 scope.go:117] "RemoveContainer" containerID="bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb" Dec 03 18:17:16 crc kubenswrapper[4841]: E1203 18:17:16.908479 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb\": container with ID starting with bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb not found: ID does not exist" containerID="bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.908519 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb"} err="failed to get container status \"bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb\": rpc error: code = NotFound desc = could not find container \"bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb\": container with ID starting with bf82a07c2e1a107a710b0743f30b1b582a111ce5b7ae119469ca1423efdf0efb not found: ID does not exist" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.908547 4841 scope.go:117] "RemoveContainer" containerID="4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880" Dec 03 18:17:16 crc kubenswrapper[4841]: E1203 
18:17:16.908895 4841 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880\": container with ID starting with 4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880 not found: ID does not exist" containerID="4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880" Dec 03 18:17:16 crc kubenswrapper[4841]: I1203 18:17:16.908928 4841 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880"} err="failed to get container status \"4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880\": rpc error: code = NotFound desc = could not find container \"4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880\": container with ID starting with 4c310a386583c8fd604b79864077cbcf7255b9c95ed26a426034c0e82ac74880 not found: ID does not exist" Dec 03 18:17:18 crc kubenswrapper[4841]: I1203 18:17:18.249652 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1bafac9-12dc-4884-8590-cf2e27ca7e13" path="/var/lib/kubelet/pods/e1bafac9-12dc-4884-8590-cf2e27ca7e13/volumes" Dec 03 18:17:27 crc kubenswrapper[4841]: I1203 18:17:27.240239 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:17:27 crc kubenswrapper[4841]: E1203 18:17:27.241320 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:17:41 crc kubenswrapper[4841]: I1203 18:17:41.239119 
4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:17:41 crc kubenswrapper[4841]: E1203 18:17:41.239960 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:17:55 crc kubenswrapper[4841]: I1203 18:17:55.239520 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:17:55 crc kubenswrapper[4841]: E1203 18:17:55.240532 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:18:06 crc kubenswrapper[4841]: I1203 18:18:06.254893 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3" Dec 03 18:18:06 crc kubenswrapper[4841]: E1203 18:18:06.256776 4841 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-c9kmk_openshift-machine-config-operator(2cd214d0-d838-44a7-8a1a-ef7855cc1bd3)\"" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" Dec 03 18:18:21 crc kubenswrapper[4841]: I1203 
18:18:21.239870 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3"
Dec 03 18:18:21 crc kubenswrapper[4841]: I1203 18:18:21.552805 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"496056ffbb2bbe3e48c8550f8dfc63bd92857de2e5d93353fdc069d7a4fef52a"}
Dec 03 18:18:34 crc kubenswrapper[4841]: I1203 18:18:34.702178 4841 generic.go:334] "Generic (PLEG): container finished" podID="a4fc2842-ab60-4853-a50f-3d238c0f5824" containerID="3ba9a3536cdcab9e445cd9fc83d5a2db021440cf4646e77ccd185283f6abf6bc" exitCode=0
Dec 03 18:18:34 crc kubenswrapper[4841]: I1203 18:18:34.702260 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dc867/must-gather-xcsb2" event={"ID":"a4fc2842-ab60-4853-a50f-3d238c0f5824","Type":"ContainerDied","Data":"3ba9a3536cdcab9e445cd9fc83d5a2db021440cf4646e77ccd185283f6abf6bc"}
Dec 03 18:18:34 crc kubenswrapper[4841]: I1203 18:18:34.703392 4841 scope.go:117] "RemoveContainer" containerID="3ba9a3536cdcab9e445cd9fc83d5a2db021440cf4646e77ccd185283f6abf6bc"
Dec 03 18:18:34 crc kubenswrapper[4841]: I1203 18:18:34.787944 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dc867_must-gather-xcsb2_a4fc2842-ab60-4853-a50f-3d238c0f5824/gather/0.log"
Dec 03 18:18:45 crc kubenswrapper[4841]: I1203 18:18:45.373630 4841 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dc867/must-gather-xcsb2"]
Dec 03 18:18:45 crc kubenswrapper[4841]: I1203 18:18:45.374515 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dc867/must-gather-xcsb2" podUID="a4fc2842-ab60-4853-a50f-3d238c0f5824" containerName="copy" containerID="cri-o://624b27619e2d2369691d92c6bfda4a83f7b29592547b2a98c0879139f2a367b4" gracePeriod=2
Dec 03 18:18:45 crc kubenswrapper[4841]: I1203 18:18:45.385985 4841 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dc867/must-gather-xcsb2"]
Dec 03 18:18:45 crc kubenswrapper[4841]: I1203 18:18:45.859816 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dc867_must-gather-xcsb2_a4fc2842-ab60-4853-a50f-3d238c0f5824/copy/0.log"
Dec 03 18:18:45 crc kubenswrapper[4841]: I1203 18:18:45.861223 4841 generic.go:334] "Generic (PLEG): container finished" podID="a4fc2842-ab60-4853-a50f-3d238c0f5824" containerID="624b27619e2d2369691d92c6bfda4a83f7b29592547b2a98c0879139f2a367b4" exitCode=143
Dec 03 18:18:45 crc kubenswrapper[4841]: I1203 18:18:45.994924 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dc867_must-gather-xcsb2_a4fc2842-ab60-4853-a50f-3d238c0f5824/copy/0.log"
Dec 03 18:18:45 crc kubenswrapper[4841]: I1203 18:18:45.995613 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc867/must-gather-xcsb2"
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.136184 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4fc2842-ab60-4853-a50f-3d238c0f5824-must-gather-output\") pod \"a4fc2842-ab60-4853-a50f-3d238c0f5824\" (UID: \"a4fc2842-ab60-4853-a50f-3d238c0f5824\") "
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.136234 4841 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pdxd\" (UniqueName: \"kubernetes.io/projected/a4fc2842-ab60-4853-a50f-3d238c0f5824-kube-api-access-8pdxd\") pod \"a4fc2842-ab60-4853-a50f-3d238c0f5824\" (UID: \"a4fc2842-ab60-4853-a50f-3d238c0f5824\") "
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.159209 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4fc2842-ab60-4853-a50f-3d238c0f5824-kube-api-access-8pdxd" (OuterVolumeSpecName: "kube-api-access-8pdxd") pod "a4fc2842-ab60-4853-a50f-3d238c0f5824" (UID: "a4fc2842-ab60-4853-a50f-3d238c0f5824"). InnerVolumeSpecName "kube-api-access-8pdxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.237948 4841 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8pdxd\" (UniqueName: \"kubernetes.io/projected/a4fc2842-ab60-4853-a50f-3d238c0f5824-kube-api-access-8pdxd\") on node \"crc\" DevicePath \"\""
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.309254 4841 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4fc2842-ab60-4853-a50f-3d238c0f5824-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "a4fc2842-ab60-4853-a50f-3d238c0f5824" (UID: "a4fc2842-ab60-4853-a50f-3d238c0f5824"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.340095 4841 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a4fc2842-ab60-4853-a50f-3d238c0f5824-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.873385 4841 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dc867_must-gather-xcsb2_a4fc2842-ab60-4853-a50f-3d238c0f5824/copy/0.log"
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.874233 4841 scope.go:117] "RemoveContainer" containerID="624b27619e2d2369691d92c6bfda4a83f7b29592547b2a98c0879139f2a367b4"
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.874346 4841 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dc867/must-gather-xcsb2"
Dec 03 18:18:46 crc kubenswrapper[4841]: I1203 18:18:46.926177 4841 scope.go:117] "RemoveContainer" containerID="3ba9a3536cdcab9e445cd9fc83d5a2db021440cf4646e77ccd185283f6abf6bc"
Dec 03 18:18:48 crc kubenswrapper[4841]: I1203 18:18:48.258038 4841 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4fc2842-ab60-4853-a50f-3d238c0f5824" path="/var/lib/kubelet/pods/a4fc2842-ab60-4853-a50f-3d238c0f5824/volumes"
Dec 03 18:20:39 crc kubenswrapper[4841]: I1203 18:20:39.316030 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 18:20:39 crc kubenswrapper[4841]: I1203 18:20:39.316743 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 18:21:09 crc kubenswrapper[4841]: I1203 18:21:09.316103 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 18:21:09 crc kubenswrapper[4841]: I1203 18:21:09.316849 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 18:21:39 crc kubenswrapper[4841]: I1203 18:21:39.316066 4841 patch_prober.go:28] interesting pod/machine-config-daemon-c9kmk container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 18:21:39 crc kubenswrapper[4841]: I1203 18:21:39.316611 4841 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 18:21:39 crc kubenswrapper[4841]: I1203 18:21:39.316693 4841 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk"
Dec 03 18:21:39 crc kubenswrapper[4841]: I1203 18:21:39.317767 4841 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"496056ffbb2bbe3e48c8550f8dfc63bd92857de2e5d93353fdc069d7a4fef52a"} pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 18:21:39 crc kubenswrapper[4841]: I1203 18:21:39.317861 4841 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" podUID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerName="machine-config-daemon" containerID="cri-o://496056ffbb2bbe3e48c8550f8dfc63bd92857de2e5d93353fdc069d7a4fef52a" gracePeriod=600
Dec 03 18:21:39 crc kubenswrapper[4841]: I1203 18:21:39.846898 4841 generic.go:334] "Generic (PLEG): container finished" podID="2cd214d0-d838-44a7-8a1a-ef7855cc1bd3" containerID="496056ffbb2bbe3e48c8550f8dfc63bd92857de2e5d93353fdc069d7a4fef52a" exitCode=0
Dec 03 18:21:39 crc kubenswrapper[4841]: I1203 18:21:39.846963 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerDied","Data":"496056ffbb2bbe3e48c8550f8dfc63bd92857de2e5d93353fdc069d7a4fef52a"}
Dec 03 18:21:39 crc kubenswrapper[4841]: I1203 18:21:39.847235 4841 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-c9kmk" event={"ID":"2cd214d0-d838-44a7-8a1a-ef7855cc1bd3","Type":"ContainerStarted","Data":"bbaf4c45c9ca40157fd80670c6650f9e7f7a0dc87ba06205424f407e1506188d"}
Dec 03 18:21:39 crc kubenswrapper[4841]: I1203 18:21:39.847258 4841 scope.go:117] "RemoveContainer" containerID="651b6ae9bf8fd7a606337d089b7cc9902cb613d11065596960bae53cb6a39ea3"